- 6 Views
- 0 replies
- 0 kudos
Input SQL script (assume any dialect):
SELECT b.se10,
       b.se3,
       b.se_aggrtr_indctr,
       b.key_swipe_ind
FROM
  (SELECT se10,
          se3,
          se_aggrtr_indctr,
          ROW_NUMBER() OVER (PARTITION BY SE10
...
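The snippet above is cut off, but the `ROW_NUMBER() OVER (PARTITION BY ...)` shape is the usual keep-one-row-per-key pattern. A minimal runnable sketch of that pattern, using SQLite (window functions need SQLite >= 3.25); the sample rows, the ORDER BY column, and the `rn = 1` filter are assumptions for illustration, not from the original post:

```python
import sqlite3

# Column names match the truncated snippet; data is made up.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE swipes (
        se10 TEXT, se3 TEXT, se_aggrtr_indctr TEXT, key_swipe_ind TEXT
    )
""")
conn.executemany(
    "INSERT INTO swipes VALUES (?, ?, ?, ?)",
    [("A", "x", "Y", "1"), ("A", "x", "N", "2"), ("B", "y", "Y", "1")],
)
rows = conn.execute("""
    SELECT se10, se3, se_aggrtr_indctr, key_swipe_ind
    FROM (
        SELECT se10, se3, se_aggrtr_indctr, key_swipe_ind,
               ROW_NUMBER() OVER (PARTITION BY se10
                                  ORDER BY key_swipe_ind) AS rn
        FROM swipes
    )
    WHERE rn = 1   -- keep exactly one row per se10 partition
""").fetchall()
print(rows)  # one surviving row for 'A', one for 'B'
```

The same SELECT runs unchanged on most dialects that support window functions, including Databricks SQL.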
by SreeG • New Contributor II
- 291 Views
- 3 replies
- 0 kudos
Hi, I am facing issues when deploying workflows to a different environment. The same works for Notebooks and Scripts, but when deploying the workflows it fails with "Authorization Failed. Your token may be expired or lack the valid scope". Anything shoul...
- 6 Views
- 0 replies
- 0 kudos
We are a data consultancy. Our free trial period is about to end, and we are still doing a POC for one of our potential clients, focusing on providing expert services around Databricks. 1. Is there a possibility that we can extend the free t...
- 40 Views
- 1 reply
- 0 kudos
Hi, We have a DLT pipeline that has been running for a while with a Hive metastore target that has stored billions of records. We'd like to move the data to Unity Catalog. The documentation says "Existing pipelines that use the Hive metastore cannot...
Latest Reply
@MarkD good day!
I'm sorry, but as the documentation describes, existing pipelines using the Hive metastore cannot be upgraded to use Unity Catalog. To migrate an existing pipeline that writes to Hive metastore, you must create a new pipeline and re-in...
- 2106 Views
- 4 replies
- 3 kudos
I am currently working with a VNET-injected Databricks workspace. At the moment I have mounted an ADLS Gen2 resource on the Databricks cluster. When running notebooks on a single node that read, transform, and write data, we do not encounter any probl...
Latest Reply
@TheDataDexter Did you find a solution to your problem? I am facing the same issue
3 More Replies
- 338 Views
- 8 replies
- 0 kudos
I need help with migrating from DBFS on Databricks to the workspace. I am new to Databricks and am struggling with what is on the links provided. My workspace.yml also has DBFS hard-coded. Included is a full deployment with Great Expectations. This was don...
Latest Reply
One of the other suggestions is to use Lakehouse Federation. It may also be a driver issue (the logs will tell us).
7 More Replies
- 2035 Views
- 3 replies
- 0 kudos
Hello, I currently have a Delta folder as a table with several columns that are nullable. I want to migrate data to the table and overwrite the content using PySpark, add several new columns, and make them not nullable. I have found a way to make the c...
Latest Reply
Not sure if you found a solution; you can also try the below. In this case you pass the full path to the Delta location, not the table name itself:
spark.sql(f"ALTER TABLE delta.`{full_delta_path}` ALTER COLUMN {column_name} SET NOT NULL")
2 More Replies
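The reply above addresses the Delta table by storage path rather than by catalog name. A small sketch of building that exact statement; the path `/mnt/lake/events` and column `event_id` are hypothetical placeholders, not from the thread:

```python
def set_not_null_sql(full_delta_path: str, column_name: str) -> str:
    """Build the Spark SQL statement from the reply: address the Delta
    table via delta.`<path>` instead of a catalog table name."""
    return (
        f"ALTER TABLE delta.`{full_delta_path}` "
        f"ALTER COLUMN {column_name} SET NOT NULL"
    )

stmt = set_not_null_sql("/mnt/lake/events", "event_id")
print(stmt)
# In a Databricks notebook you would then run: spark.sql(stmt)
```

Note that SET NOT NULL fails if existing rows already contain nulls in that column, so backfill first.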
- 75 Views
- 1 reply
- 1 kudos
Do Delta Live Tables pipelines support Oracle or external database connectivity? I am getting an Oracle driver not found error. DLT does not support Maven installs through Asset Bundles. ERRORs: 1) py4j.protocol.Py4JJavaError: An error occurred while call...
Latest Reply
Hi @venkata_kishore ,
As of now, DLT does not support Oracle, and one cannot install third-party libraries and JARs.
https://docs.databricks.com/en/delta-live-tables/unity-catalog.html#limitations
If Lakehouse Federation has support for Oracle, then ...
by Phani1 • Valued Contributor
- 29 Views
- 0 replies
- 0 kudos
Hi Team,
Can you provide information about topology, centralized, and federated workspaces in Databricks and how they are used?
Regards,
Janga
- 77 Views
- 0 replies
- 1 kudos
Hello, We are using Delta Live Tables to ingest data from multiple business groups, each with different input file formats and parsing requirements. The input files are ingested from Azure Blob Storage. Right now, we are only servicing three busines...
- 455 Views
- 5 replies
- 4 kudos
When running my notebook using personal compute with instance profile I am indeed able to readStream from kinesis. But adding it as a DLT with UC, while specifying the same instance-profile in the DLT pipeline setting - causes a "MissingAuthenticatio...
Latest Reply
@Mathias_Peters, thanks for the details. Curious how to make the roleArn part work; we are able to make it work only by passing the accessKey and secretKey, not with roleArn. If you are using SQL-based DLT tables, could you please share some code samp...
4 More Replies
- 581 Views
- 1 reply
- 0 kudos
I connect to Databricks Unity Catalog and get this error: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500540) Error caught in BackgroundFetcher. Foreground thread ID: 59. Background thread ID: 61. Error caught: null. at com.databricks.cli...
Latest Reply
Hi @dprutean, Thank you for providing the details about the error you’re encountering while connecting to the Databricks Unity Catalog using the Databricks JDBC driver.
Let’s troubleshoot this step by step:
Check your connection string:
The conn...
- 130 Views
- 2 replies
- 0 kudos
Hi Databricks community, Hope you are doing well. I am trying to create an external table using a Gzipped CSV file uploaded to an S3 bucket. The S3 URI of the resource doesn't have a file extension, but the content of the file is a Gzipped comma-sepa...
Latest Reply
Hey, thanks for your response. I tried using a SerDe (I think the OpenCSVSerde should work for me) but unfortunately I'm getting the below from Unity Catalog: [UC_DATASOURCE_NOT_SUPPORTED] Data source format hive is not supported in Unity Catalog....
1 More Replies
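As the thread notes, the object's content, not its extension, determines whether it is a gzipped CSV: gzip data always starts with the magic bytes 0x1f 0x8b. A small self-contained sketch of sniffing and reading such a file; the sample rows are made up for illustration:

```python
import csv
import gzip
import io

# Simulate an extension-less S3 object whose content is gzip-compressed CSV.
raw = gzip.compress(b"id,name\n1,alpha\n2,beta\n")

# Identify gzip by its magic number rather than by filename.
assert raw[:2] == b"\x1f\x8b"

with gzip.open(io.BytesIO(raw), mode="rt", newline="") as fh:
    rows = list(csv.DictReader(fh))

print(rows)
```

Spark's CSV reader, by contrast, typically infers the compression codec from the file extension, so an extension-less gzip object may need to be staged or copied with a `.csv.gz` suffix before the external-table route works; that workaround is an assumption to verify against your setup.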
- 240 Views
- 1 reply
- 0 kudos
Just finished the final day of training. Great content and delivery!
Latest Reply
Hi @KrishnaK135, That's wonderful to hear! We're thrilled that you found the content and delivery of the training at DAIS 2023 to be excellent. Your positive feedback means a lot to us!
We also wanted to share some exciting news with you all. The Dat...
- 4457 Views
- 7 replies
- 0 kudos
Hi Team, How do I write a recursive CTE in Databricks SQL? Please let me know if anyone has a solution for this.
Latest Reply
Weirdly, if you ask the Databricks Assistant, it tells you it does support recursive CTEs and will give you sample code. If you follow up and press it on details, it doubles down and says it supports it, but tells you the runtime version informat...
6 More Replies
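For reference, this is the standard-SQL `WITH RECURSIVE` syntax the thread is asking about, run here in SQLite purely for illustration (per the thread, Databricks SQL did not support it at the time, so the usual workaround is an iterative loop in a notebook). The 1-to-5 sequence is a minimal assumed example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Anchor member (SELECT 1) plus recursive member (n + 1 while n < 5).
numbers = [n for (n,) in conn.execute("""
    WITH RECURSIVE seq(n) AS (
        SELECT 1
        UNION ALL
        SELECT n + 1 FROM seq WHERE n < 5
    )
    SELECT n FROM seq
""")]
print(numbers)  # [1, 2, 3, 4, 5]
```

On an engine without recursive CTEs, the same result is typically produced by looping in the driver program and unioning intermediate DataFrames until a fixed point is reached.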