- 2893 Views
- 6 replies
- 1 kudos
DataFrame to CSV write has issues due to multiple commas inside a row value
Hi all, I am working on converting data containing JSON fields with embedded commas into CSV format. I am facing challenges due to the commas within the JSON being misinterpreted as column delimiters during the conversion. I tried several methods to modify...
Hi Sai, I assume the problem comes not from PySpark but from Excel. I tried to reproduce the error and couldn't - that's a good thing, right? Please try the following: df.write.format("csv").save("/Volumes/<my_catalog_name>/<m...
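If you want to rule out PySpark-side quoting, here is a minimal sketch along those lines; the volume path and toy data are placeholders. Spark already wraps fields containing the delimiter in quotes, and the options below just make that behaviour explicit:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Toy frame whose "payload" column contains embedded commas.
df = spark.createDataFrame(
    [(1, '{"a": 1, "b": 2}')],
    ["id", "payload"],
)

# Quote fields containing commas so CSV readers keep them in one cell.
(df.write.format("csv")
   .option("header", "true")
   .option("quote", '"')
   .option("escape", '"')
   .mode("overwrite")
   .save("/Volumes/<my_catalog_name>/<my_schema_name>/<my_volume_name>/out"))
```

Excel should then parse the quoted JSON column back into a single cell.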
- 1205 Views
- 1 replies
- 0 kudos
Access Delta Sharing from Azure Data Factory
I recently got access to Delta Sharing and I am looking to access the data from the tables in the share through ADF. I used linked services such as REST API and HTTP and successfully established a connection using the credential file token and HTTP path, h...
Hey, I think you'll need to use a Databricks activity instead of Copy. See: https://learn.microsoft.com/en-us/azure/data-factory/connector-overview#integrate-with-more-data-stores and https://learn.microsoft.com/en-us/azure/data-factory/transform-data-dat...
- 2072 Views
- 4 replies
- 1 kudos
Redefine ETL strategy with PySpark approach
Hey everyone! I have some previous experience with data engineering, but I'm totally new to Databricks and Delta Tables. I'm starting this thread hoping to ask some questions and get help on how to design a process. I essentially have 2 Delta tables (sa...
Hi @databird, You can review the code of each demo by opening the content via "View the Notebooks" or by exploring the following repo: https://github.com/databricks-demos (you can search for "merge" to see all the occurrences, for example) T...
- 1454 Views
- 2 replies
- 0 kudos
There is no certification number in my Databricks certificate that I received after passing the
I recently enrolled for the Databricks Data Engineer certification, took the exam, and cleared it successfully. I received the certificate as a PDF file along with a URL where I can see my certificate and ba...
Hi @vinay076, Thanks for asking! Our support team can provide you with a credential ID. Please file a ticket with our support team, give them the email address associated with your certification, and they can get you the credential ID.
- 4851 Views
- 5 replies
- 4 kudos
Resolved! How to obtain a list of workflows in Databricks?
I need to obtain a list of my Databricks workflows with their job IDs in a Databricks notebook.
Hi @VabethRamirez, Also, instead of using the API directly, you can use the Databricks Python SDK:
%pip install databricks-sdk --upgrade
dbutils.library.restartPython()
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
job_list = w.jobs...
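For completeness, a finished version of that snippet might look like the sketch below; it assumes the databricks-sdk package and notebook authentication, and relies on jobs.list() yielding job objects that carry job_id and settings.name:

```python
from databricks.sdk import WorkspaceClient

# Authenticates automatically when run inside a Databricks notebook.
w = WorkspaceClient()

# jobs.list() yields job objects; print the ID and name of each workflow.
for job in w.jobs.list():
    print(job.job_id, job.settings.name)
```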
- 1230 Views
- 1 replies
- 0 kudos
Can the query history API /api/2.0/sql/history/queries return data which is older than 30 days?
I am using this API but it returns data for only the last 30 days. Can it return data older than 30 days?
Hi @RahulChaubey, The query history system table was announced during the Q1 roadmap webinar (see the recording, 32:25). There is a chance that it will provide data with a horizon beyond 30 days. Meanwhile, you can enable system tables - I hope some ...
- 1742 Views
- 2 replies
- 0 kudos
Can a Delta Table be the source of streaming/Auto Loader?
Hi, Since Auto Loader only accepts "append-only" data as the source, I am wondering if a Delta Table can also be the source. Will VACUUM (deleting stale files) or _delta_log (creating nested files in a different format than parquet) break A...
Hi @QPeiran, Auto Loader is a feature for ingesting files into the data platform. Once your data is stored in a Delta Table, you can rely on spark.readStream.table("<my_table_name>") to continuously read from the table. Take a look at ...
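A minimal sketch of that pattern, with placeholder table and checkpoint names:

```python
# Continuously read rows appended to a Delta table.
stream = spark.readStream.table("<my_table_name>")

# Write them to a downstream table; the checkpoint tracks progress
# so the stream resumes where it left off.
(stream.writeStream
    .option("checkpointLocation", "/Volumes/<catalog>/<schema>/<volume>/_checkpoint")
    .trigger(availableNow=True)
    .toTable("<my_downstream_table>"))
```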
- 840 Views
- 1 replies
- 0 kudos
Handling large volumes of streamed transactional data using DLT
We have a data stream from Event Hub with approximately 10 million rows per day (into one table); these records are insert-only (no updates). We are trying to find a solution to aggregate/group the data based on multiple data points, and our requ...
Hi, please find below a set of resources I believe are relevant for you.
Success stories: you can find the success stories of companies leveraging streaming on Databricks here.
Videos: Introduction to Data Streaming on the Lakehouse: Structured Stream...
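For the aggregation itself, here is a minimal Delta Live Tables sketch of a streaming group-by over an insert-only source; the bronze table name and the event_time, merchant_id, and amount columns are all assumptions for illustration:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(name="transactions_hourly_agg")
def transactions_hourly_agg():
    # Stream from the insert-only bronze table (name assumed) and
    # aggregate per hour and merchant; the watermark bounds state size.
    return (
        dlt.read_stream("transactions_bronze")
        .withWatermark("event_time", "1 hour")
        .groupBy(F.window("event_time", "1 hour"), "merchant_id")
        .agg(
            F.count("*").alias("tx_count"),
            F.sum("amount").alias("total_amount"),
        )
    )
```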
- 3032 Views
- 2 replies
- 0 kudos
Resolved! Rearrange tasks in a Databricks workflow
Hello, Is there any way to rearrange tasks in a Databricks workflow? I would like the line that joins the two marked tasks not to pass behind the other tasks. Is it possible to route this line to one side? Thanks.
Hi @chemajar, Take a look at Databricks Asset Bundles. They let you streamline the development of complex workflows using a YAML definition. If you need to change the task dependencies, you can rearrange the flow as needed - just change the ...
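For illustration, a minimal sketch of such a YAML job definition inside a bundle; the job, task, and notebook names are placeholders, and each task declares its upstream tasks with depends_on:

```yaml
resources:
  jobs:
    my_workflow:
      name: my_workflow
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py
        - task_key: transform
          # Runs only after "ingest" completes.
          depends_on:
            - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/transform.py
```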
- 1657 Views
- 2 replies
- 0 kudos
Do we pay just for query run duration while using Databricks serverless SQL?
While using Databricks serverless SQL to run queries, do we only pay for the compute resources during the run duration of the query?
- 808 Views
- 1 replies
- 0 kudos
Hi, can you clarify what your aim is? Maybe there is no need to use the DB SDK at all?
- 1936 Views
- 3 replies
- 0 kudos
Unity Catalog view access in Azure Storage account
Hi, I have my Unity Catalog in an Azure Storage account and I can access table objects, but I couldn't find the views that were created on top of those tables. 1. I can access Delta tables & related views via Databricks SQL and also find the tab...
Hi, A couple of options are possible: Use Databricks to do the complex SQL queries (joins, unions, etc.) and write to a staging Delta Table, then use DataFlow to read from that staged table. Orchestrate all of this using ADF or even Databricks Workflo...
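A minimal sketch of that first option, using illustrative catalog, schema, table, and column names:

```python
# Materialize the complex view logic into a staging Delta table that
# external tools (e.g. DataFlow) can read directly.
staged = spark.sql("""
    SELECT o.order_id, o.amount, c.region   -- columns assumed for illustration
    FROM   main.sales.orders o
    JOIN   main.sales.customers c ON o.customer_id = c.customer_id
""")

(staged.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("main.sales.orders_staged"))
```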
- 1915 Views
- 2 replies
- 1 kudos
PowerBI Tips
Does anyone have any tips for using Power BI on top of Databricks? Any best practices you know of, or roadblocks you have run into that should be avoided? Thanks.
Hey, Use Partner Connect to establish a connection to PBI. Consider using Databricks SQL Serverless warehouses for the best user experience and performance (see Intelligent Workload Management, aka auto-scaling and query queuing, remote result cache, ...
- 1028 Views
- 1 replies
- 0 kudos
Connecting to a Databricks SQL warehouse from .NET
Hi, How can I connect to a Databricks SQL warehouse from a .NET application? Kr
Hey, please take a look at the Statement Execution API. Best,
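The Statement Execution API is plain REST, so any .NET HTTP client can call it; the request shape is sketched below in Python for brevity (host, token, and warehouse ID are placeholders):

```python
import requests

HOST = "https://<workspace-host>"     # placeholder workspace URL
TOKEN = "<personal-access-token>"     # placeholder PAT
WAREHOUSE_ID = "<warehouse-id>"       # placeholder SQL warehouse ID

# Submit a statement and wait up to 30s for the result inline.
resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "warehouse_id": WAREHOUSE_ID,
        "statement": "SELECT 1 AS x",
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
# Assumes the statement finished within the wait timeout.
print(resp.json()["result"]["data_array"])
```

The same POST can be issued from C# with HttpClient.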
- 1116 Views
- 1 replies
- 1 kudos
Resolved! Can we get SQL Serverless warehouse monitoring data using APIs or logs?
I am looking for a way to get the autoscaling history for SQL Serverless warehouses using an API or logs. I want something like what we see in the monitoring UI.
Hi Rahul, you need to perform two actions:
1. Enable the system tables schema named "compute" (how-to; take a look at the page, you may well find other schemas useful too).
2. Explore the system.compute.warehouse_events table.
Hope this helps. B...
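A minimal sketch of querying that table once the schema is enabled; the warehouse ID is a placeholder, and the column list follows the system-tables documentation:

```python
# Scaling history for one warehouse over the last 7 days.
events = spark.sql("""
    SELECT event_time, event_type, cluster_count
    FROM   system.compute.warehouse_events
    WHERE  warehouse_id = '<warehouse-id>'
      AND  event_time >= current_timestamp() - INTERVAL 7 DAYS
    ORDER BY event_time
""")
events.show(truncate=False)
```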