- 1219 Views
- 3 replies
- 1 kudos
I'm trying to access a Databricks SQL Warehouse with Python. I'm able to connect with a token from a Compute Instance on Azure Machine Learning. It's a VM with conda installed; I created an env with Python 3.10.
from databricks import sql as dbsql
dbsq...
Latest Reply
The issue was that the new version of databricks-sql-connector (3.0.1) does not handle error messages well. So it gave a generic error and a timeout where it should have returned a 403 with an immediate error message instead of a 900-second timeout. https://gith...
2 More Replies
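For reference, a minimal connection sketch with databricks-sql-connector; the hostname, HTTP path, and token below are placeholders you would take from your SQL Warehouse's "Connection details" tab:

```python
from databricks import sql as dbsql

# All three values are placeholders; use your warehouse's connection
# details and a valid personal access token.
with dbsql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123",
    access_token="dapi...",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())
```

With a connector version that includes the fix above, an invalid token should surface a 403 immediately rather than retrying until the 900-second timeout.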
- 344 Views
- 1 replies
- 0 kudos
Hi, there is a set of .csv/.txt files in a storage container, i.e. Azure Blob Storage / Azure Data Lake Storage Gen2. I would like to ingest the files into Databricks. Dataset and LinkedServices were created on both ends. An all-purpose cluster was also created in Bric...
Latest Reply
Hi @menonshiji, The error in the Databricks Pipeline indicates that the Azure Blob Storage container is inaccessible due to the lack of proper credentials. To resolve this issue, make sure to include the necessary credentials in the Spark configurati...
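As a sketch, key-based access in the Spark configuration might look like this; the storage account, container, and secret-scope names are hypothetical, and a service principal or managed identity would be preferable in production:

```python
# Hypothetical account/container/scope names; the key should come from a
# secret scope rather than being hard-coded in the notebook.
spark.conf.set(
    "fs.azure.account.key.mystorageacct.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-key"),
)

df = spark.read.option("header", "true").csv(
    "abfss://mycontainer@mystorageacct.dfs.core.windows.net/input/"
)
```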
- 2151 Views
- 3 replies
- 3 kudos
Woahhh, an #Excel plug-in for #DeltaSharing! Now I can import Delta tables directly into my spreadsheet using Delta Sharing. It puts the power of #DeltaLake into the hands of millions of business users. What does this mean? Imagine a data provider delivering...
Latest Reply
If you have any uncertainties, feel free to inquire here or connect with me on my LinkedIn profile for further assistance.
2 More Replies
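For anyone who wants the same shared data in Python rather than Excel, the delta-sharing client can load a shared table into pandas; the profile path and share/schema/table names below are placeholders:

```python
import delta_sharing

# "profile.share" is the credential file a data provider sends you;
# the share, schema, and table names are placeholders.
df = delta_sharing.load_as_pandas(
    "/path/to/profile.share#my_share.my_schema.my_table"
)
print(df.head())
```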
- 281 Views
- 1 replies
- 0 kudos
I have a JavaRDD with complex nested XML content that I want to unmarshal using JAXB to get the data into Java objects. Can anyone please help with how I can achieve this? Thanks
Latest Reply
I hope this works:
JavaPairRDD<String, PortableDataStream> jrdd = javaSparkContext.binaryFiles("<path_to_file>");
Map<String, PortableDataStream> mp = jrdd.collectAsMap();
OutputStream os = new FileOutputStream(f);
mp.values().forEach(pd -> { try...
by JensH
- 2590 Views
- 3 replies
- 2 kudos
Hi, I would like to use the new "Job as Task" feature but I'm having trouble passing values.
Scenario: I have a workflow job which contains 2 tasks.
Task_A (type "Notebook"): Read data from a table and, based on the contents, decide whether the workflow in ...
Latest Reply
I found the following information:
value is the value for this task value's key. This command must be able to represent the value internally in JSON format. The size of the JSON representation of the value cannot exceed 48 KiB. You can refer to https...
2 More Replies
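Since the 48 KiB cap applies to the JSON form of the value, a small pure-Python check can catch oversized values before a task fails. The dbutils lines only work on Databricks, and the key/task names are hypothetical:

```python
import json

# Databricks caps a task value at 48 KiB of JSON (per the docs quoted above).
TASK_VALUE_LIMIT = 48 * 1024

def fits_task_value_limit(value):
    """Return True if value's JSON representation is within the 48 KiB cap."""
    return len(json.dumps(value).encode("utf-8")) <= TASK_VALUE_LIMIT

# On Databricks the value is then shared between tasks like this
# (key and task names are hypothetical):
# dbutils.jobs.taskValues.set(key="decision", value={"run_etl": True})   # in Task_A
# dbutils.jobs.taskValues.get(taskKey="Task_A", key="decision")          # in Task_B
```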
- 392 Views
- 1 replies
- 0 kudos
Hi, when a job is running, I would like to change its parameters with an API call. I know that I can set parameter values from the API when I start a job, and that I can update the default values if the job isn't running, but I didn't find an API c...
Latest Reply
No, there is currently no option to change parameters while the job is running. From the UI you will be able to modify them, but it won't affect the current run; it will only be applied to new job runs you trigger.
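To run with different values you would trigger a new run through the Jobs API instead; a sketch of the request body for POST /api/2.1/jobs/run-now, where the job ID and parameter names are hypothetical:

```python
import json

def run_now_body(job_id, job_parameters):
    """Build the body for POST /api/2.1/jobs/run-now. The parameters apply
    only to the run this call starts, never to a run already in flight."""
    return json.dumps({"job_id": job_id, "job_parameters": job_parameters})

body = run_now_body(1234, {"env": "prod"})
```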
- 5679 Views
- 2 replies
- 3 kudos
What is the difference between Databricks Auto-Loader and Delta Live Tables? Both seem to manage ETL for you but I'm confused on where to use one vs. the other.
Latest Reply
You say "...__would__ be a piece..." and "...DLT __would__ pick up...". Is DLT built upon AL?
1 More Replies
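The two compose rather than compete: Auto Loader is the incremental file-ingestion mechanism, and Delta Live Tables is the pipeline framework that can use it as a source. A sketch of a DLT table fed by Auto Loader, with a hypothetical table name and landing path:

```python
import dlt

@dlt.table(name="raw_events")
def raw_events():
    # format("cloudFiles") is Auto Loader; the landing path is a placeholder.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")
    )
```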
- 8083 Views
- 11 replies
- 4 kudos
I have successfully passed the test after completing the course with 95%. But I haven't received any badge from your side as promised. I have been provided with a certificate which looks fake by itself. I need to post my credentials on LinkedIn wi...
Latest Reply
I'm facing a similar issue. I completed the training and the quiz successfully and was able to download a course completion certificate. The certificate doesn't have any ID and looks very generic and fake. I have signed up for the https://credentials.d...
10 More Replies
- 1009 Views
- 1 replies
- 0 kudos
Hi! I am pulling data from Blob storage into Databricks using Auto Loader. This process is working well for almost 10 resources, but for a specific one I am getting this error: java.lang.NullPointerException. It looks like this issue occurs when I connect to th...
Latest Reply
@Maxi1693 - The value for cloudFiles.schemaEvolutionMode should be a string. Could you please try changing the below from
.option("cloudFiles.schemaEvolutionMode", None)
to
.option("cloudFiles.schemaEvolutionMode", "none")
and let us know.
Refe...
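Put together, the corrected Auto Loader read might look like this; the file format and path are placeholders:

```python
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    # Must be the string "none", not Python's None
    .option("cloudFiles.schemaEvolutionMode", "none")
    .load("/mnt/source/specific-resource/")  # placeholder path
)
```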
- 896 Views
- 5 replies
- 1 kudos
Latest Reply
Yes, I meant to set it to None. Is the issue specific to any particular cluster? Or do you see the issue with all the clusters in your workspace?
4 More Replies
- 573 Views
- 2 replies
- 1 kudos
Hi guys, I am new to Delta pipelines. I have created a pipeline, and now when I try to run it I get the error message "PERMISSION_DENIED: You are not authorized to create clusters. Please contact your administrator", even though I can crea...
Latest Reply
Thank you for responding, @Palash01, and thanks for giving me the direction. To get around it I had to be granted the "unrestricted cluster creation" permission.
1 More Replies
- 545 Views
- 3 replies
- 0 kudos
Do you know why the userIdentity is anonymous in AWS CloudTrail logs even though I have specified an instance profile?
Latest Reply
Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...
2 More Replies
- 1993 Views
- 5 replies
- 2 kudos
If this isn't the right spot to post this, please move it or refer me to the right area. I recently learned about "_metadata.file_name"; it's not quite what I need. I'm creating a new table in Databricks and want to add a USR_File_Name column cont...
Latest Reply
Hi, Could you please elaborate more on the expectation here?
4 More Replies
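If the goal is a column holding the source file name, one common approach uses the _metadata hidden column available on file-based sources; the input path is a placeholder, and the column name comes from the question:

```python
from pyspark.sql.functions import col

# _metadata is only available on DataFrames read from file sources;
# the input path here is a placeholder.
df = (
    spark.read.format("csv")
    .option("header", "true")
    .load("/mnt/input/")
    .withColumn("USR_File_Name", col("_metadata.file_name"))
)
```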
- 236 Views
- 1 replies
- 0 kudos
Hey guys, how can I get the pricing of cluster VM types (Standard_D*, Standard_E*, Standard_F*, etc.)? I'm doing a study to decrease the cost of my current cluster. Any ideas? Thank you!
Latest Reply
Hey, you can use the pricing calculator here: https://www.databricks.com/product/pricing/product-pricing/instance-types
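The total hourly cost is roughly the cloud VM price plus the DBU charge; a sketch with made-up numbers, where the real rates come from the calculator above:

```python
def cluster_cost_per_hour(vm_hourly_usd, dbu_per_hour, dbu_rate_usd):
    """Hourly cluster cost = cloud VM price + (DBUs consumed per hour x DBU rate).
    All example figures below are made up; real rates come from the pricing pages."""
    return vm_hourly_usd + dbu_per_hour * dbu_rate_usd

# e.g. a hypothetical Standard_D-series node
cost = cluster_cost_per_hour(vm_hourly_usd=0.50, dbu_per_hour=1.5, dbu_rate_usd=0.40)
```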
- 1283 Views
- 4 replies
- 1 kudos
Hi, I'm trying to create a calendar dimension including a fiscal year with a fiscal start of April 1. I'm using the fiscalyear library and am setting the start to month 4, but it insists on setting April to month 7. Runtime: 12.1. My code snippet is: start_...
Latest Reply
import fiscalyear
import datetime

def get_fiscal_date(year, month, day):
    fiscalyear.setup_fiscal_calendar(start_month=4)
    v_fiscal_month = fiscalyear.FiscalDateTime(year, month, day).fiscal_month  # To get the fiscal month
    v_fiscal_quarter = fiscalyea...
3 More Replies
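For reference, the fiscal-month arithmetic reduces to a one-liner: with a start month of 4, April maps to fiscal month 1, while the library's default start of October (the US federal fiscal year) would indeed put April at month 7, which suggests setup_fiscal_calendar wasn't taking effect in the failing run. A pure-Python sketch, independent of the fiscalyear package:

```python
def fiscal_month(month, start_month=4):
    """Map a calendar month (1-12) to its fiscal month for a fiscal year
    beginning in start_month."""
    return (month - start_month) % 12 + 1
```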