Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

vyasakhilesh
by New Contributor
  • 431 Views
  • 1 reply
  • 0 kudos

Error creating table from Delta location on DBFS

[UC_FILE_SCHEME_FOR_TABLE_CREATION_NOT_SUPPORTED] Creating table in Unity Catalog with file scheme dbfs is not supported. Instead, please create a federated data source connection using the CREATE CONNECTION command for the same table provider, then ...

Latest Reply
agallard2
New Contributor III
  • 0 kudos

Hi @vyasakhilesh, the error you're seeing, [UC_FILE_SCHEME_FOR_TABLE_CREATION_NOT_SUPPORTED], occurs because Unity Catalog in Databricks does not support creating tables directly from DBFS (Databricks File System) locations. In this case, you're trying...
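For anyone landing here with the same error, a minimal sketch of the usual workaround (the catalog, schema, and storage URI below are hypothetical): point the table at the underlying cloud storage path, registered as a UC external location, instead of a dbfs:/ path.

```python
# Minimal sketch, run from a Databricks notebook where `spark` is predefined.
# The table name and abfss URI are placeholders -- substitute a path covered
# by one of your UC external locations; dbfs:/ paths are what trigger the error.
spark.sql("""
    CREATE TABLE main.analytics.events
    USING DELTA
    LOCATION 'abfss://data@mystorageaccount.dfs.core.windows.net/tables/events'
""")
```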

SethParker
by New Contributor III
  • 812 Views
  • 6 replies
  • 0 kudos

Resolved! SQL View Formatting in Catalog - Can you turn it off?

It appears as though Databricks now formats SQL View definitions when showing them in the Catalog.  Our solution is based on views, and we have comment tags in those views.  We format these views so that it is easy for us to find and update parts of ...

Latest Reply
SethParker
New Contributor III
  • 0 kudos

Thank you! I will submit that request. In case anyone else stumbles upon this post, here is a function you can add that will return the view definition from information_schema, unformatted, with an ALTER statement at the top: DROP FUNCTION IF EXISTS <c...
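The function above is truncated; as a rough sketch of the same idea (catalog, schema, and view names are placeholders, not the poster's actual function), the unformatted definition can be pulled straight from information_schema:

```python
# Fetch the original, unformatted view definition from information_schema
# and prepend an ALTER VIEW header for easy editing. Names are placeholders.
row = spark.sql("""
    SELECT view_definition
    FROM my_catalog.information_schema.views
    WHERE table_schema = 'my_schema' AND table_name = 'my_view'
""").first()
print(f"ALTER VIEW my_catalog.my_schema.my_view AS\n{row.view_definition}")
```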

5 More Replies
John_Rotenstein
by New Contributor II
  • 5426 Views
  • 7 replies
  • 1 kudos

ODBC on Windows -- Where to specify Catalog name?

We are attempting to connect a Windows ODBC application to Unity Catalog. The "Configure the Databricks ODBC and JDBC drivers" documentation has a section titled "ODBC configuration and connection parameters" that mentions a configuration parameter call...

Latest Reply
PiotrU
Contributor II
  • 1 kudos

It's quite interesting. I'm on a Mac with Simba Spark ODBC 2.8.2, and if I don't add the "Catalog" parameter, the UI only shows the default catalog (if I have access to it). That doesn't mean I can't query the others; they're just not listed in the ...
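To make the catalog parameter concrete, a hedged sketch from Python via pyodbc (the DSN name, catalog, and schema are placeholders; the same Catalog/Schema keys can also be set in the DSN configuration itself):

```python
import pyodbc

# Connect through an existing Databricks DSN, setting the session's initial
# catalog and schema. "Databricks", "main", and "default" are placeholders.
conn = pyodbc.connect("DSN=Databricks;Catalog=main;Schema=default", autocommit=True)
print(conn.cursor().execute("SELECT current_catalog()").fetchone())
```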

6 More Replies
ws4100e
by New Contributor III
  • 5777 Views
  • 9 replies
  • 0 kudos

DLT pipelines with UC

I'm trying to run a (very simple) DLT pipeline in which the resulting materialized table is published to a UC schema that has a managed storage location defined (within an existing EXTERNAL LOCATION). According to the documentation: Publishing to schemas that speci...
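For context, a hedged sketch of the pipeline settings in play (all names are placeholders): the catalog/target pair controls where the materialized table is published, and the channel field is what the reply below asks about -- CURRENT is the stable DLT runtime, PREVIEW ships upcoming features earlier.

```python
# Sketch of the relevant pipeline settings as a Python dict (other fields
# omitted; all names are placeholders).
pipeline_settings = {
    "name": "my_dlt_pipeline",
    "catalog": "main",        # UC catalog to publish into
    "target": "my_schema",    # schema whose managed storage location is used
    "channel": "PREVIEW",     # or "CURRENT" (the stable default)
}
```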

Latest Reply
ImranA
Contributor
  • 0 kudos

What is the difference between the Preview and the Current channel?

8 More Replies
maranBH
by New Contributor III
  • 27495 Views
  • 5 replies
  • 11 kudos

Resolved! How to import a function to another notebook using Repos without %run?

Hi all, I was reading the Repos documentation: https://docs.databricks.com/repos.html#migrate-from-run-commands It explains that one advantage of Repos is that it is no longer necessary to use the %run magic command to make functions available in one notebook to ...

Latest Reply
JakubSkibicki
Contributor
  • 11 kudos

Due to the new functionality in Runtime 16.0 regarding autoload, I came across this thread. I performed a practical test and it works, although I had some problems at first. As in the accepted solution, the key was that the definitions are placed in a file.py, not in a notebook.
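To spell out the pattern the accepted answer describes, a minimal sketch (the file path and function are hypothetical):

```python
# Contents of helpers/my_funcs.py -- a plain .py file in the repo, not a
# notebook. The repo root is already on sys.path for notebooks in the repo.
def add_one(x: int) -> int:
    return x + 1

# In any notebook in the same repo, a regular import then replaces %run:
#   from helpers.my_funcs import add_one
#   add_one(41)  # 42
```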

4 More Replies
dtabass
by New Contributor III
  • 61483 Views
  • 5 replies
  • 6 kudos

How/where can I see a list of my dbfs files?

When using the Community Edition, I'm trying to find a place in the UI where I can browse the files that I've uploaded to dbfs. How/where can I do that? When I try to view them from the Data sidebar I see nothing, yet I know they're there, as if I us...

Latest Reply
vigneshmayil
New Contributor II
  • 6 kudos

1. Enable it from Settings -> Advanced -> DBFS Browser.
2. Refresh the page.
3. Browse by clicking Catalog -> Browse DBFS.
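As a quick alternative to the UI, the same listing is available from any notebook via dbutils (the path below is a placeholder):

```python
# List DBFS contents programmatically; works on Community Edition too.
for f in dbutils.fs.ls("/FileStore/"):
    print(f.path, f.size)
```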

4 More Replies
lakshgisprog
by New Contributor II
  • 1057 Views
  • 3 replies
  • 0 kudos

Create a simple Geospatial Table with Geography type column

Hello all, I'm looking for guidance on creating a simple US states table with shape as a GEOGRAPHY column type. I do not want to use Apache Sedona (due to cluster limitations). I am going to create a Node.js application which is going to query this ge...

Latest Reply
lakshgisprog
New Contributor II
  • 0 kudos

Thank you for the prompt response. Yes, I have gone through the blog and followed the same process; for example, the buildings table has geometry stored as a binary type. My question is how to store geometry in the 'GEOGRAPHY' type. The GEOGRAPHY data type ...
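For what it's worth, a heavily hedged sketch of the shape this could take: the GEOGRAPHY(4326) column type appears in the Databricks geospatial preview docs, but the constructor function name used below is an assumption and may differ by runtime or preview version.

```python
# Assumes the geospatial preview: GEOGRAPHY(4326) as a column type and an
# ST_GeogFromText-style constructor (both assumptions -- check your runtime).
spark.sql("""
    CREATE TABLE main.geo.us_states (
        state_name STRING,
        shape GEOGRAPHY(4326)
    )
""")
spark.sql("""
    INSERT INTO main.geo.us_states VALUES
    ('Wyoming', st_geogfromtext('POLYGON((-111 41, -104 41, -104 45, -111 45, -111 41))'))
""")
```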

2 More Replies
zed
by New Contributor III
  • 988 Views
  • 5 replies
  • 0 kudos

Can't pass dynamic parameters to non-notebook Python job (spark_python_task)

I need to access the date of a given job run as a non-notebook Python job (spark_python_task). I want to pass a value from the CLI when running it and have the value available in the script. I tried the approaches in the attached image w...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Can you confirm if this solution applies to you https://community.databricks.com/t5/data-engineering/retrieve-job-level-parameters-in-spark-python-task-not-notebooks/td-p/75324 ?
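For reference, the gist of the linked solution: values in a spark_python_task's "parameters" array arrive as plain command-line arguments, so the script reads them from sys.argv; a job-level parameter can be forwarded with a {{job.parameters.<name>}} reference. A minimal sketch (the parameter name is hypothetical):

```python
import sys

# The task spec (elsewhere) would pass:
#   "parameters": ["{{job.parameters.run_date}}"]
# Here the forwarded value is simply argv[1].
run_date = sys.argv[1] if len(sys.argv) > 1 else None
print(f"run_date = {run_date}")
```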

4 More Replies
KFries
by New Contributor II
  • 1865 Views
  • 2 replies
  • 2 kudos

SQL Notebook Tab Spacing

My SQL notebooks in Databricks suffer from having several different space counts between tab stops. It makes it very difficult to maintain pretty code spacing. What sets the tab spacing in SQL-language notebooks, and how is it set/adju...

Latest Reply
louisv-bambaw
New Contributor II
  • 2 kudos

I’m experiencing the same issue with SQL cell indentation in Databricks notebooks. While editing, I’ve noticed that the indentation level can vary from one cell to another - sometimes it’s two spaces, other times it’s four. This inconsistency makes i...

1 More Replies
597581
by New Contributor III
  • 3812 Views
  • 22 replies
  • 26 kudos

Resolved! Run selected text shortcut not working

The keyboard shortcut to run selected text (Ctrl + Shift + Enter) has not been working for me since yesterday (10/31/24). Instead of running the selected text, Databricks notebooks are treating it like Shift + Enter and running the entire cell. I hav...

Latest Reply
Alberto_Umana
Databricks Employee
  • 26 kudos

Folks, could you please double-check now? The issue should be fixed. Thanks!

21 More Replies
AtulMathur
by New Contributor III
  • 846 Views
  • 2 replies
  • 1 kudos

Resolved! Comparing two SQL notebooks from different Environments

Hello everyone, I am part of a data testing team responsible for verifying data trends and insights generated from different sources. There are multiple schemas and tables in our platform. We use SQL queries in notebooks to verify all enrichment, m...

Latest Reply
AtulMathur
New Contributor III
  • 1 kudos

Thank you, Walter. I did think about doing it one by one, but that did not turn out to be a very efficient approach. I then found a way to do it in Python by iterating through a dataframe of table names.
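A sketch of that iterate-and-compare approach (catalog and table names are placeholders, not the poster's code):

```python
# Compare each table across two environments; exceptAll in both directions
# counts rows that differ. Assumes identical schemas per table.
tables = ["sales.orders", "sales.customers"]  # hypothetical names
for t in tables:
    a = spark.table(f"env_a.{t}")
    b = spark.table(f"env_b.{t}")
    diff = a.exceptAll(b).count() + b.exceptAll(a).count()
    print(t, "OK" if diff == 0 else f"{diff} differing rows")
```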

1 More Replies
OliverCadman
by New Contributor III
  • 18686 Views
  • 7 replies
  • 6 kudos

'File not found' error when executing %run magic command

I'm just walking through a simple exercise presented in the Databricks Platform Lab notebook, in which I'm executing a remote notebook from within it using the %run command. The remote notebook resides in the same directory as the Platform Lab notebook,...

Labels: %file_not_found, %magic_commands, %run
Latest Reply
ArturOA
New Contributor III
  • 6 kudos

I have seen this error pop up when you define a Python file without the header "# Databricks notebook source". Databricks has a hard time running the file as a nested notebook, and you can get some weird errors.
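To illustrate, a minimal sketch of such a file (the file name and function are hypothetical):

```python
# Databricks notebook source
# (The marker above must be the very first line of helper.py, a hypothetical
# file; %run ./helper can then execute it as a nested notebook.)
def greet(name: str) -> str:
    return f"Hello, {name}"
```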

6 More Replies
Brad
by Contributor II
  • 624 Views
  • 3 replies
  • 0 kudos

What is "ExecuteGrpcResponseSender: Deadline reached, shutting down stream"

Hi, I have a Delta table which is loaded by a Structured Streaming job. When I try to read this Delta table and do a MERGE with foreachBatch, I find that sometimes there is a big interval between the stream starting and the MERGE starting to run, and it seems Spar...
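For readers unfamiliar with the setup being described, a generic sketch of the MERGE-in-foreachBatch pattern (table names, key column, and checkpoint path are placeholders, not the poster's code):

```python
from delta.tables import DeltaTable

def upsert_batch(batch_df, batch_id):
    # MERGE each micro-batch into the target Delta table on a key column.
    target = DeltaTable.forName(batch_df.sparkSession, "main.db.target")
    (target.alias("t")
           .merge(batch_df.alias("s"), "t.id = s.id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())

(spark.readStream.table("main.db.source")
      .writeStream
      .foreachBatch(upsert_batch)
      .option("checkpointLocation", "/tmp/checkpoints/upsert")
      .start())
```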

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

It may not necessarily be a bug, but rather something that needs tuning due to architectural differences. What the message says is: the system was processing a gRPC operation identified by opId=5ef071b7-xxx, and it set a deadline for that operation (likely 120 seconds). The...

2 More Replies
dtb_usr
by New Contributor II
  • 1881 Views
  • 1 reply
  • 1 kudos

Creating a private connection with Google Sheets

How do I ingest sensitive data from a Google Sheets document into Databricks Unity Catalog without making the sheet public?

Latest Reply
agallard2
New Contributor III
  • 1 kudos

Hi @dtb_usr, you can share the Google Sheet with the service account and use the Google Sheets API client. Open the Google Sheet you want to access, click Share, and add the email address of the service account (it will look something like your-service-acc...
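A hedged sketch of the rest of that flow using the gspread client (needs %pip install gspread; the credential path, sheet key, and table name below are placeholders):

```python
import gspread

# Authenticate with the service-account JSON key, read the privately shared
# sheet, and land it in a UC table. All names below are placeholders.
gc = gspread.service_account(filename="/dbfs/FileStore/creds/sa.json")
rows = gc.open_by_key("SHEET_KEY").sheet1.get_all_records()
spark.createDataFrame(rows).write.saveAsTable("main.raw.google_sheet_data")
```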

radix
by New Contributor II
  • 516 Views
  • 1 reply
  • 0 kudos

Databricks cluster pools with init scripts

Ability to submit a single job with cluster pools and init scripts for the following payload: { "run_name": "A multitask job run", "timeout_seconds": 86400, "tasks": [ { "task_key": "task_1", "depends_on": ...
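For reference, a hedged sketch of how such a payload can combine an instance pool with init scripts (pool ID, paths, and versions are placeholders), posted to /api/2.1/jobs/runs/submit:

```python
# Sketch of the runs-submit payload as a Python dict; all IDs and paths
# are placeholders.
payload = {
    "run_name": "A multitask job run",
    "timeout_seconds": 86400,
    "tasks": [{
        "task_key": "task_1",
        "new_cluster": {
            "spark_version": "15.4.x-scala2.12",
            "instance_pool_id": "pool-1234abcd",  # node type comes from the pool
            "num_workers": 2,
            "init_scripts": [
                {"volumes": {"destination": "/Volumes/main/tools/init/setup.sh"}}
            ],
        },
        "notebook_task": {"notebook_path": "/Workspace/jobs/task_1"},
    }],
}
```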

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Are you still facing issues with the job run submit API endpoint?

