Upload file from REST API Response directly to ADLS
I have a use case to call a REST API that returns a file as a base64 response. Is it possible to save the response directly to ADLS without converting it to a file first?
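One possible approach, as a minimal sketch: decode the base64 payload in memory and upload the raw bytes straight to ADLS Gen2 with the azure-storage-file-datalake SDK, so no intermediate local file is needed. The endpoint, JSON field name, storage account, container, target path, and credential below are all hypothetical placeholders.

```python
import base64
import requests
from azure.storage.filedatalake import DataLakeServiceClient

# Call the REST API; assume the file arrives as a base64 string in the JSON body.
resp = requests.get("https://example.com/api/export")        # hypothetical endpoint
file_bytes = base64.b64decode(resp.json()["content"])        # decode entirely in memory

# Upload the bytes directly to ADLS Gen2 -- no local file is ever written.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential="<account-key-or-token>",
)
file_client = (service.get_file_system_client("my-container")
               .get_file_client("incoming/report.pdf"))
file_client.upload_data(file_bytes, overwrite=True)
```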
Unable to run query: INSERT INTO with 'NaN' value in SQL Editor.
Query: Insert into ABC with values('xyz',123,Nan);
Error: org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation cannot be cast to org.apache.spark.sql.execution.datasources...
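Independent of the cast exception itself, note that Spark SQL has no bare NaN literal; the usual way to insert one is to cast the string 'NaN' to a floating-point type. A minimal sketch (the table and columns are hypothetical stand-ins for ABC):

```python
# Create a throwaway table and insert a NaN by casting the string 'NaN'.
spark.sql("CREATE TABLE IF NOT EXISTS ABC (name STRING, id INT, score DOUBLE)")
spark.sql("INSERT INTO ABC VALUES ('xyz', 123, CAST('NaN' AS DOUBLE))")

# Verify: isnan() picks up the row we just wrote.
spark.sql("SELECT * FROM ABC WHERE isnan(score)").show()
```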
We are using a serverless SQL warehouse and managed tables in Unity Catalog in Azure Databricks. When using the designated catalog tab, I can see and filter for functions, but when I am developing in a notebook, there are only tables and views visibl...
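A quick check from the notebook side, as a sketch (catalog and schema names are hypothetical): SHOW USER FUNCTIONS lists the Unity Catalog functions the notebook session can actually resolve, which helps separate a permissions problem from a UI one.

```python
# List user-defined functions visible to this session in a given UC schema.
spark.sql("USE CATALOG my_catalog")
spark.sql("SHOW USER FUNCTIONS IN my_catalog.my_schema").show(truncate=False)
```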
Hi All, I use the latest version of Databricks (2.6.34) in my MuleSoft application, and the code deployed successfully locally, but while testing I observed the error mentioned below. Kindly check and help resolve the issue. at com.databrick...
We have two environments for our Azure Databricks: Dev and Prod. We had clusters created and tested in the Dev environment, then they were exported to the Prod environment through APIs. The clusters in Dev are performing as expected, whereas the cluster...
Both Prod and Dev are connected to Unity Catalog, and I am working with the same table in both envs. Can something done during the creation of the workspace itself affect the performance of clusters? Do clusters update to the latest Databricks runtime versi...
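For reference, a minimal sketch of the export/import flow described above, using the Clusters API (GET /api/2.0/clusters/get, POST /api/2.0/clusters/create). Hosts, tokens, and the cluster ID are hypothetical; the point is that spark_version travels with the spec, so pinning it keeps Dev and Prod on the same runtime.

```python
import requests

dev = {"host": "https://adb-dev.azuredatabricks.net", "token": "<dev-pat>"}
prod = {"host": "https://adb-prod.azuredatabricks.net", "token": "<prod-pat>"}

# Fetch the full cluster spec from Dev.
spec = requests.get(
    f"{dev['host']}/api/2.0/clusters/get",
    headers={"Authorization": f"Bearer {dev['token']}"},
    params={"cluster_id": "0101-123456-abcdef"},
).json()

# Keep only fields clusters/create accepts; spark_version pins the DBR version.
keys = ["cluster_name", "spark_version", "node_type_id", "num_workers",
        "autoscale", "spark_conf", "azure_attributes"]
payload = {k: spec[k] for k in keys if k in spec}

requests.post(
    f"{prod['host']}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {prod['token']}"},
    json=payload,
).raise_for_status()
```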
Hi, my scenario is this:
- Autoloader loads data from ADLS once a day.
- Data are merged into the bronze zone.
- The bronze table has CDF turned on.
- I read the CDF and merge into silver.
All works fine, but when there is no new file for Autoloader to process, the CDF ta...
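For context, a minimal sketch of the bronze-to-silver hop via Change Data Feed; the table names, merge key, and starting version are hypothetical placeholders.

```python
from delta.tables import DeltaTable

# Read only the changes since the last processed version of the bronze table.
changes = (spark.read.format("delta")
           .option("readChangeFeed", "true")
           .option("startingVersion", 5)   # persist this watermark between runs
           .table("main.bronze.orders")
           .filter("_change_type IN ('insert', 'update_postimage')"))

# Merge the changes into silver keyed on order_id.
(DeltaTable.forName(spark, "main.silver.orders").alias("t")
 .merge(changes.alias("s"), "t.order_id = s.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```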
Whenever I try to use a serverless SQL Warehouse on a Power BI dataset, it does not refresh on the Power BI online service. It does work normally for other types of Databricks clusters. The catalog is being defined on the Power Query import. I...
Hi, we have the exact same issue, even if we specify the catalog in the connection parameters. However, OAuth authentication through a dataflow (instead of from Power Query Desktop) works fine. In Desktop we are on version 2.122.746.0, but the issue is...
We have enabled system tables, and under them we can see the compute schema, but under that we are not able to see cluster and node_type in Azure Databricks. Are there any limitations for the above tables?
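One thing worth checking, sketched below: each system schema has to be enabled per metastore before its tables appear, and the system-schemas API reports the state of each. The host, token, and metastore ID are hypothetical placeholders.

```python
import requests

host, token = "https://adb-xxx.azuredatabricks.net", "<pat>"
metastore_id = "<metastore-id>"
headers = {"Authorization": f"Bearer {token}"}

# List each system schema and its state (e.g. ENABLE_COMPLETED vs AVAILABLE).
r = requests.get(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas",
    headers=headers,
)
print(r.json())

# Enable the compute schema if it is still only AVAILABLE.
requests.put(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/compute",
    headers=headers,
).raise_for_status()
```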
Hello! I'm following this documentation: https://docs.databricks.com/en/getting-started/community-edition.html and I can't complete step 4 because I don't receive any email to verify my email address. I have checked the spam folder; no email. Is there so...
@farah please check the junk mail folder and also message quarantine if you are using Microsoft services. The verification mail goes out fine; however, it may show up in Junk or Quarantine.
Hi all, I'm trying to create one job cluster with one configuration or specification that runs a workflow, and this workflow needs to have 3 dependent tasks in a straight line, for example t1 -> t2 -> t3. In Databricks there are some constraints also...
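A minimal sketch of a Jobs API 2.1 payload for this shape: one shared job cluster plus three tasks chained t1 -> t2 -> t3 via depends_on. The notebook paths, node type, and host are hypothetical placeholders.

```python
import requests

job = {
    "name": "three-step-pipeline",
    "job_clusters": [{
        "job_cluster_key": "shared",
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
    }],
    "tasks": [
        {"task_key": "t1", "job_cluster_key": "shared",
         "notebook_task": {"notebook_path": "/Repos/etl/t1"}},
        {"task_key": "t2", "job_cluster_key": "shared",
         "depends_on": [{"task_key": "t1"}],
         "notebook_task": {"notebook_path": "/Repos/etl/t2"}},
        {"task_key": "t3", "job_cluster_key": "shared",
         "depends_on": [{"task_key": "t2"}],
         "notebook_task": {"notebook_path": "/Repos/etl/t3"}},
    ],
}

# Create the job; all three tasks reuse the single shared job cluster.
requests.post(
    "https://adb-xxx.azuredatabricks.net/api/2.1/jobs/create",
    headers={"Authorization": "Bearer <pat>"},
    json=job,
).raise_for_status()
```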
Hello, we are using DLT pipelines for many of our jobs, with notifications on failures sent to Slack. Wondering if there is a clean way to disable the alerts when in development mode. It does make sense to have them turned off in dev, doesn't it?
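One workable pattern, as a sketch: build the pipeline settings in deployment code and only attach the notifications block when not deploying in development mode. The host, notebook path, and recipient address (e.g. a Slack email-forwarding address) are hypothetical.

```python
import requests

is_dev = True  # e.g. derived from the deploy target or an environment variable

settings = {
    "name": "my-dlt-pipeline",
    "development": is_dev,
    "libraries": [{"notebook": {"path": "/Repos/etl/dlt_pipeline"}}],
}
if not is_dev:
    # Alerts only exist in the prod spec, so dev runs never page anyone.
    settings["notifications"] = [{
        "email_recipients": ["dlt-alerts@example-slack-forward.com"],
        "alerts": ["on-update-failure", "on-update-fatal-failure"],
    }]

requests.post(
    "https://adb-xxx.azuredatabricks.net/api/2.0/pipelines",
    headers={"Authorization": "Bearer <pat>"},
    json=settings,
).raise_for_status()
```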
1. How do we use the cloudFiles.backfillInterval option in a notebook?
2. Does any other property need to be set for it?
3. Where exactly is it placed: in the readStream portion of the code or the writeStream portion?
4. Do you have any sample code?
5. Where do we find ...
1. Is the following code correct for specifying the .option("cloudFiles.backfillInterval", 300)?

df = spark.readStream.format("cloudFiles") \
    .option("cloudFiles.format", "csv") \
    .option("cloudFiles.schemaLocation", f"dbfs:/FileStore/xyz/back_fill_opti...
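For comparison, an untruncated sketch of the same pattern. Note that cloudFiles.backfillInterval is documented as an interval string (e.g. "1 day" or "1 week"), so a bare 300 is likely not the expected format. The paths, table name, and checkpoint location are hypothetical placeholders.

```python
# Auto Loader stream with a periodic backfill listing to catch missed files.
df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "csv")
      .option("cloudFiles.schemaLocation", "dbfs:/FileStore/xyz/schema")
      .option("cloudFiles.backfillInterval", "1 day")
      .load("abfss://landing@account.dfs.core.windows.net/xyz/"))

# The option belongs on the readStream side; the writeStream is ordinary.
(df.writeStream
   .option("checkpointLocation", "dbfs:/FileStore/xyz/checkpoint")
   .trigger(availableNow=True)
   .toTable("main.bronze.xyz"))
```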
I am unable to log in to my Databricks Community account; my email is "sureshsuthar1251@gmail.com". When I try to log in, it says account not found.
I am testing Databricks with non-AWS S3 object storage and want to test it with Databricks on Azure and Databricks on AWS. My Databricks account is currently using Databricks on AWS with metadata and a single node compute running. Can the same accou...
Thank you Kaniz for posting the link. Looking at that, I believe the answer is: this is not possible in Databricks for now.
@Cert-Team I have registered for the Databricks Certified Data Engineer Associate exam, completed the biometric and other prerequisites required, and launched my exam. While writing the exam, it exited twice with some technical issue although I don...
Thank you for your support. I have given my test.