- 453 Views
- 0 replies
- 0 kudos
I am using Terraform to configure a Databricks workspace, and while mounting 6 buckets, if the mount takes longer than 20 minutes I get a timeout. Is it possible to change the timeout? Thanks, Horatiu
by
DimaP
• New Contributor II
- 375 Views
- 0 replies
- 0 kudos
Is it sufficient to use the checkpoint directory with write-ahead logs? BTW, I use the Kafka connector to read data from EventHub.
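For context on the question above: with the Kafka source, offsets and stream progress are tracked in the checkpoint location itself, so a checkpoint directory on the sink is generally what provides recovery; a separate write-ahead log is not needed for Kafka-style sources. A minimal configuration sketch, assuming the standard Spark Kafka connector against the Event Hubs Kafka endpoint (namespace, topic, and paths below are placeholders, not real values):

```python
# Sketch: Event Hubs via its Kafka-compatible endpoint, with checkpoint-based recovery.
# All names and paths are placeholders for illustration.
df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
      .option("subscribe", "<eventhub-name>")
      .option("startingOffsets", "earliest")
      .load())

(df.writeStream
   .format("delta")
   # Offsets and stream state are persisted here; this is what enables recovery.
   .option("checkpointLocation", "/mnt/checkpoints/eventhub_stream")
   .start("/mnt/delta/eventhub_table"))
```

On restart, the query resumes from the offsets recorded in the checkpoint rather than re-reading from the beginning.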
by
cblock
• New Contributor III
- 1185 Views
- 3 replies
- 3 kudos
So, in this case our jobs are deployed from our development workspace to our isolated testing workspace via an automated Azure DevOps pipeline. As such, they are created (and thus run as) a service account user. Recently we made the switch to using gi...
Latest Reply
Hi @Chris Block, hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...
2 More Replies
- 570 Views
- 0 replies
- 0 kudos
I am able to encrypt and decrypt the data in multiple ways and to save the encrypted Parquet file, but I want to decrypt the data only if the user has a specific permission; otherwise they should get the encrypted data. Is there any permanent solution to de...
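A common pattern for the permission question above is to wrap decryption in a function that checks the caller's entitlements and returns the ciphertext unchanged for everyone else (on Databricks this check is often done with `is_member()` inside a SQL UDF). A minimal pure-Python sketch of the gating logic; the `"decrypt"` permission name is hypothetical, and base64 is only a stand-in for a real cipher such as Fernet:

```python
import base64

# base64 is NOT encryption; it stands in for a real cipher here purely to
# show where the permission check gates decryption.
def encrypt(plaintext: str) -> str:
    return base64.b64encode(plaintext.encode()).decode()

def read_column(value: str, user_permissions: set) -> str:
    """Return plaintext only for users holding the hypothetical 'decrypt'
    permission; everyone else sees the stored ciphertext unchanged."""
    if "decrypt" in user_permissions:
        return base64.b64decode(value.encode()).decode()
    return value

token = encrypt("secret-value")
print(read_column(token, {"decrypt"}))  # privileged user sees the plaintext
print(read_column(token, {"read"}))     # other users see the encrypted form
```

Exposed as a view over the encrypted table, this makes the permission check "permanent" in the sense that users never query the raw column directly.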
- 429 Views
- 0 replies
- 0 kudos
This SQL statement works fine by itself: SELECT COUNT(1) FROM tablea f INNER JOIN tableb t ON lower(f.col1) = t.col1, but if I want to use it inside a function: CREATE OR REPLACE FUNCTION fn_abc(var1 ...
- 1099 Views
- 2 replies
- 2 kudos
We're using the following method (generated by dbx) to access dbutils, e.g. to retrieve parameters from secret scopes:

@staticmethod
def _get_dbutils(spark: SparkSession) -> "dbutils":
    try:
        from pyspark.dbutils import...
Latest Reply
We have something similar in our code. It worked on runtime 13 until last week. The Machine Learning DBR doesn't work either.
1 More Replies
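For reference, the dbx-generated helper discussed above typically tries the cluster-side `pyspark.dbutils` import first and falls back to the notebook's IPython user namespace. A defensive sketch of that pattern, assuming nothing about the environment (the broad exception handling is for illustration, so the function simply degrades to None outside Databricks):

```python
def get_dbutils(spark=None):
    """Best-effort dbutils lookup: try the Databricks cluster import first,
    then the notebook's IPython namespace; return None outside Databricks."""
    try:
        # pyspark.dbutils ships with the Databricks Runtime, not OSS pyspark.
        from pyspark.dbutils import DBUtils
        return DBUtils(spark)
    except Exception:
        pass
    try:
        import IPython
        ip = IPython.get_ipython()
        if ip is not None:
            return ip.user_ns.get("dbutils")
    except Exception:
        pass
    return None
```

Called on a local machine this returns None, which makes unit testing the surrounding code straightforward.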
- 3199 Views
- 4 replies
- 3 kudos
I feel like I am going crazy with this. I have tested a data pipeline on my standard compute cluster. I am loading new files as batch from a Google Cloud Storage bucket. Autoloader works exactly as expected from my notebook on my compute cluster. The...
Latest Reply
I found the issue. I describe the solution in the following SO post. https://stackoverflow.com/questions/76287095/databricks-autoloader-works-on-compute-cluster-but-does-not-work-within-a-task/76313794#76313794
3 More Replies
by
g96g
• New Contributor III
- 552 Views
- 1 replies
- 0 kudos
I'm having a hard time converting the function below from SSMS to a Databricks function. Any help would be appreciated!

CREATE FUNCTION [dbo].[MaxOf5Values] (@D1 [int],@D2 [int],@D3 [int],@D4 [int],@D5 [int]) RETURNS int
AS
BEGIN
DECLARE @Result int
...
Latest Reply
Hi @Givi Salu, please refer to this link, which will help you convert this function.
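Worth noting: in Databricks SQL the entire T-SQL body above usually collapses to the built-in GREATEST(d1, d2, d3, d4, d5), so no procedural DECLARE/BEGIN logic is needed. The same logic as a quick Python check (the function name here is just for illustration):

```python
def max_of_5(d1: int, d2: int, d3: int, d4: int, d5: int) -> int:
    """Python equivalent of the T-SQL MaxOf5Values body: return the largest
    of five integers, mirroring SQL's GREATEST(d1, d2, d3, d4, d5)."""
    return max(d1, d2, d3, d4, d5)

print(max_of_5(3, 9, 1, 7, 5))  # 9
```

One caveat to verify for your data: Spark's GREATEST skips NULL inputs and returns NULL only when all arguments are NULL, which may differ from the original T-SQL behavior.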
- 715 Views
- 1 replies
- 0 kudos
We have files that need to be loaded into a Delta table. Now we want to perform some transformations on the files and load the result into the table. What we did: create a Spark DF from the file, apply transformations on the DF, create a temp view from t...
Latest Reply
Hi @Sravan Kumar Mohanraj, yes, you can use a COPY query; in this case your temp_view will be the source. For more info, please visit these links.
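The file-to-table flow described in this thread can be sketched as follows. All paths and table names are placeholders, the filter is a stand-in for whatever transformation is needed, and whether the final step is a plain INSERT ... SELECT (as below) or a MERGE/COPY depends on deduplication requirements:

```python
# Sketch: file -> DataFrame -> transform -> temp view -> Delta table.
# Paths, table names, and the transformation are placeholders.
df = (spark.read
      .format("csv")
      .option("header", "true")
      .load("/mnt/raw/input_files/"))

transformed = df.filter("amount IS NOT NULL")  # example transformation only
transformed.createOrReplaceTempView("temp_view")

spark.sql("""
    INSERT INTO target_delta_table
    SELECT * FROM temp_view
""")
```

If the same files may be reprocessed, a MERGE keyed on the business key avoids duplicate rows where a blind INSERT would not.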
- 1159 Views
- 2 replies
- 3 kudos
I am trying to run some API calls to the account console. I tried every possible syntax, encoded and decoded, to get the call to succeed, but it returns a "user not authenticated" error. When I tried with the Account admin, it worked. I need t...
Latest Reply
Hi Venkat, that sounds like a good idea. Thanks
1 More Replies
- 296 Views
- 0 replies
- 1 kudos
Welcome to the Chennai User Group Community! We are thrilled to have you join our vibrant and enthusiastic community of users! Whether you are a seasoned expert or a newcomer, this group is the perfect place for you to connect, learn, and grow alongside...
- 1566 Views
- 3 replies
- 4 kudos
I have a Scala function as below; I am unable to understand how to write a Scala JAR with it. Please find below the code I have used, from Enforcing Column-Level Encryption - Databricks:

%scala
import com.macasaet.fernet.{Key, StringValidator, Token}
import o...
Latest Reply
I finally had to create the JAR using the IntelliJ and sbt configuration on the same environment, and then installed the JAR on the cluster; it worked.
2 More Replies
- 1129 Views
- 2 replies
- 0 kudos
My code:

CREATE OR REPLACE TEMPORARY VIEW preprocessed_source AS
SELECT Key_ID, Distributor_ID, Customer_ID, Customer_Name, Channel
FROM integr_masterdata.Customer_Master;
-- Step 2: Perform the merge operation using the preprocessed source table...
Latest Reply
Hi @Prashant Joshi, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us s...
1 More Replies
- 1694 Views
- 6 replies
- 6 kudos
https://www.databricks.com/notebooks/delta-lake-cdf.html
I am trying to understand the above article. Could someone explain the questions below? a) From SELECT * FROM table_changes('gold_consensus_eps', 2), why are consensus_eps values of 2.1 and 2....
Latest Reply
Hi @THIAM HUAT TAN, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answer...
5 More Replies