Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

TinasheChinyati
by New Contributor
  • 12422 Views
  • 6 replies
  • 4 kudos

Is Databricks capable of housing OLTP and OLAP?

Hi data experts. I currently have an OLTP database (Azure SQL DB) that keeps data only for the past 14 days. We use partition switching to achieve that and have an ETL (Azure Data Factory) process that feeds the data warehouse (Azure Synapse Analytics). My requ...

Latest Reply
Ben_dHont
New Contributor II
  • 4 kudos

@ChrisCkx and @bsanoop Databricks is currently building OLTP database functionality, which is in private preview. It is a serverless PostgREST database. Documentation can be found here: [EXTERNAL] Online Tables REST - Private Preview Docume...

5 More Replies
verargulla
by New Contributor III
  • 10603 Views
  • 3 replies
  • 4 kudos

Azure Databricks: Error Creating Cluster

We have provisioned a new workspace in Azure using our own VNet. Upon creating the first cluster, I encounter this error: Control Plane Request Failure: Failed to get instance bootstrap steps from the Databricks Control Plane. Please check that instan...

Latest Reply
ShaneOss
New Contributor II
  • 4 kudos

I'm also seeing this issue. Was there a solution?

2 More Replies
DevGeek
by New Contributor
  • 266 Views
  • 1 reply
  • 0 kudos

Better Alternatives to ReadyAPI for API Testing?

I’m currently using ReadyAPI, mainly for API testing and some automation workflows, but I’m considering switching to something else. Has anyone here tried Apidog, Postman, or similar tools? I’m especially interested in how they compare in terms of pe...

Latest Reply
Brahmareddy
Valued Contributor III
  • 0 kudos

Hi @DevGeek, how are you doing today? Consider trying Postman if you're looking for a robust tool with a wide range of features for API testing and automation. It's known for its user-friendly interface and handles complex APIs and large datasets well,...

dwalsh
by New Contributor III
  • 421 Views
  • 2 replies
  • 0 kudos

Resolved! Cannot run ./Includes/Classroom-Setup-01.1 in Advanced Data Engineering with Databricks with 12

I have started the Advanced Data Engineering with Databricks course and have tried to run the Includes code at the start. We recently had issues with 12.2 and moved to a newer version, as there appeared to be some issues around setuptools. If I run "%r...

Latest Reply
jainendrabrown
New Contributor II
  • 0 kudos

I am also having an issue running the first command itself. I am not sure how to download the Classroom-Setup data.

1 More Replies
MadelynM
by Databricks Employee
  • 7695 Views
  • 2 replies
  • 0 kudos

Delta Live Tables + S3 | 5 tips for cloud storage with DLT

You’ve gotten familiar with Delta Live Tables (DLT) via the quickstart and getting started guide. Now it’s time to tackle creating a DLT data pipeline for your cloud storage, with one line of code. Here’s how it’ll look when you're starting: CREATE OR ...
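A minimal sketch of that idea in DLT Python (not the exact snippet from the post; the S3 path and table name below are hypothetical):

    import dlt

    # Hypothetical bucket/path and table name, for illustration only.
    @dlt.table(name="raw_events")
    def raw_events():
        return (
            spark.readStream.format("cloudFiles")   # Auto Loader
            .option("cloudFiles.format", "json")    # assumed source file format
            .load("s3://my-bucket/raw/events/")
        )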

Latest Reply
waynelxb
New Contributor II
  • 0 kudos

Hi MadelynM, how should we handle source file archival and data retention with DLT? Source file archival: once the data from the source file is loaded with DLT Auto Loader, we want to move the source file from the source folder to an archival folder. How can we ...

1 More Replies
rkshanmugaraja
by New Contributor
  • 221 Views
  • 1 reply
  • 0 kudos

Copy files and folders into the Users area, but files are not showing in the UI

Hi all, I'm trying to copy the whole training directory (which contains multiple subfolders and files) from my catalog volume area to each user's area. From: "dbfs:/Volumes/CatalogName/schema/Training" To: "dbfs:/Workspace/Users/username@domain.com/Tr...

Latest Reply
radothede
Contributor II
  • 0 kudos

Hi, do you want to use the DBFS location on purpose, or do you want to upload the training notebooks to the Workspace/Users location? The reason I'm asking is that those are two different locations, although both are related to file management in Databricks. (see:...
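A rough sketch of one way to do the copy, assuming the driver can reach both locations through their local FUSE paths and reusing the hypothetical catalog/schema/user names from the question:

    import shutil

    # Unity Catalog volumes appear under /Volumes and workspace files under
    # /Workspace on the driver; both paths below are placeholders.
    src = "/Volumes/CatalogName/schema/Training"
    dst = "/Workspace/Users/username@domain.com/Training"
    shutil.copytree(src, dst, dirs_exist_ok=True)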

slakshmanan
by New Contributor III
  • 321 Views
  • 1 reply
  • 0 kudos

How to view rows produced from the REST API in Databricks for long-running queries in the Running state

   print(f"Query ID: {query['query_id']} ,Duration: {query['duration']} ms,user :{query['user_display_name']},Query_execute :{query['query_text']},Query_status : {query['status']},rw:{query['rows_produced']}"") U am able to get rows_produced only for...

Latest Reply
slakshmanan
New Contributor III
  • 0 kudos

https://{databricks_instance}.cloud.databricks.com/api/2.0/sql/history/queries?include_metrics=true
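A hedged sketch of calling that Query History endpoint with the requests library (workspace URL and token are placeholders; metric field names may vary slightly by API version):

    import requests

    host = "https://<databricks_instance>.cloud.databricks.com"  # placeholder
    token = "<personal-access-token>"                             # placeholder

    resp = requests.get(
        f"{host}/api/2.0/sql/history/queries",
        headers={"Authorization": f"Bearer {token}"},
        params={"include_metrics": "true", "max_results": 100},
    )
    resp.raise_for_status()
    for q in resp.json().get("res", []):
        # rows-produced figures sit in the metrics block once the query reports them
        print(q.get("query_id"), q.get("status"), q.get("metrics", {}).get("rows_produced_count"))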

surajitDE
by New Contributor
  • 126 Views
  • 1 reply
  • 0 kudos

How to stop subsequent iterations in Databricks loop feature?

How to stop subsequent iterations in Databricks loop feature? sys.exit(0) or dbutils.notebook.exit() only marks the current task and series of tasks in sequence as failed, but continues with subsequent iterations.

Latest Reply
szymon_dybczak
Contributor III
  • 0 kudos

Hi @surajitDE, currently there is no out-of-the-box feature to achieve that. What you can do is implement notebook logic that, in case of error, cancels the for-each task run using the REST API or Python SDK: use the /api/2.1/jobs/runs/cancel endpoi...
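A minimal sketch of calling that cancel endpoint from the notebook (workspace URL, token, and run_id are placeholders you would resolve from your job context):

    import requests

    host = "https://<workspace-url>"      # placeholder
    token = "<personal-access-token>"     # placeholder
    run_id = 123456789                    # placeholder: the run to cancel

    resp = requests.post(
        f"{host}/api/2.1/jobs/runs/cancel",
        headers={"Authorization": f"Bearer {token}"},
        json={"run_id": run_id},
    )
    resp.raise_for_status()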

Volker
by New Contributor III
  • 1239 Views
  • 4 replies
  • 1 kudos

Asset Bundles cannot run job with single node job cluster

Hello community, we are deploying a job using asset bundles and the job should run on a single-node job cluster. Here is the DAB job definition:

    resources:
      jobs:
        example_job:
          name: example_job
          tasks:
            - task_key: main_task
              ...

Latest Reply
Volker
New Contributor III
  • 1 kudos

Sorry for the late reply, this helped, thank you! 

3 More Replies
Ericsson
by New Contributor II
  • 2751 Views
  • 3 replies
  • 1 kudos

SQL week format issue: it's not showing the result as 01 (ww)

Hi folks, I have a requirement to show the week number in ww format. Please see the code below: select weekofyear(date_add(to_date(current_date, 'yyyyMMdd'), +35)). Also please refer to the screenshot for the result.
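One way to get a zero-padded two-digit week, sketched here in PySpark (assuming the same +35-day offset as the question):

    from pyspark.sql import functions as F

    # lpad the week-of-year to two digits so week 1 renders as "01"
    df = spark.range(1).select(
        F.lpad(
            F.weekofyear(F.date_add(F.current_date(), 35)).cast("string"),
            2, "0",
        ).alias("week_ww")
    )
    df.show()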

Latest Reply
Feltonrolfson
New Contributor II
  • 1 kudos

It seems you're encountering an issue with SQL week formatting, where the results aren't displaying as expected (01 (ww)). This could impact data analysis.

2 More Replies
SamAdams
by New Contributor III
  • 161 Views
  • 1 reply
  • 0 kudos

Redacted check constraint condition in Delta Table

Hello! I have a delta table with a check constraint - it's one of many that a config-driven ETL pipeline of mine generates. When someone edits the config file and deploys the change, I'd like for the check constraint to be updated as well if it's dif...

Latest Reply
SamAdams
New Contributor III
  • 0 kudos

Figured this out with the help of @SamDataWalk's post (https://community.databricks.com/t5/data-engineering/databricks-bug-with-show-tblproperties-redacted-azure-databricks/m-p/93546). It happens because Databricks thinks certain keywords in the constra...

SamDataWalk
by New Contributor III
  • 1960 Views
  • 5 replies
  • 2 kudos

Resolved! Databricks bug with show tblproperties - redacted - Azure databricks

I am struggling to report what is a fairly fundamental bug. Can anyone help? Ideally someone from Databricks themselves, or others who can confirm they can replicate it. There is a bug where Databricks seems to be hiding “any” properties which have th...
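A minimal repro sketch along those lines (catalog, schema, table, and constraint names below are hypothetical; check constraints surface as delta.constraints.* entries in the table properties, which is where the redaction shows up):

    # Hypothetical repro: a constraint whose text contains a sensitive-looking
    # keyword (a column literally named `url`) may come back as *********(redacted).
    spark.sql("CREATE TABLE IF NOT EXISTS main.default.redaction_demo (url STRING, modifiedDateTime TIMESTAMP)")
    spark.sql("ALTER TABLE main.default.redaction_demo ADD CONSTRAINT url_not_null CHECK (url IS NOT NULL)")
    spark.sql("SHOW TBLPROPERTIES main.default.redaction_demo").show(truncate=False)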

Latest Reply
SamAdams
New Contributor III
  • 2 kudos

Like in your example, the redaction behavior seemed to pick up on the column name: a condition that included a column named "URL" was redacted, but one that included "modifiedDateTime" was not.

4 More Replies
DBX123
by New Contributor
  • 191 Views
  • 1 reply
  • 1 kudos

Is it possible to have an alert when a row is added to a table?

I currently have a table that periodically has rows added (sometimes daily, sometimes over a month). I was hoping to have an alert for when a row is added to it. The table has date fields for when rows are loaded in. I have an alert working, but it j...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

You could set your alert with a query such as:

    -- Custom alert to monitor for new rows added to a table
    SELECT COUNT(*) AS new_rows
    FROM your_table
    WHERE event_time > current_timestamp() - interval '1' hour

In this example, the query checks for rows added t...

EDDatabricks
by Contributor
  • 1340 Views
  • 2 replies
  • 1 kudos

How to enforce a cleanup policy on job cluster logs

We have a number of jobs in our Databricks workspaces. All job clusters are configured with a DBFS location to save the respective logs (configured from Job cluster -> "Advanced options" -> "Logging"). However, the logs are retained in DBFS indefi...
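One possible workaround, sketched below under the assumption that all job clusters log to a single hypothetical dbfs:/cluster-logs root and that the runtime exposes FileInfo.modificationTime, is a small scheduled cleanup notebook that deletes log folders older than a retention window:

    import time

    retention_days = 30                      # assumed retention policy
    cutoff_ms = (time.time() - retention_days * 86400) * 1000
    log_root = "dbfs:/cluster-logs"          # hypothetical logging destination

    for cluster_dir in dbutils.fs.ls(log_root):
        files = dbutils.fs.ls(cluster_dir.path)
        # drop a cluster's log folder only when everything inside is past the cutoff
        if files and all(f.modificationTime < cutoff_ms for f in files):
            dbutils.fs.rm(cluster_dir.path, recurse=True)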

Latest Reply
Atanu
Databricks Employee
  • 1 kudos

@EDDatabricks Thanks for reaching out to us. I think you should check whether our Purge option is enabled or not: https://docs.databricks.com/en/administration-guide/workspace/settings/storage.html

1 More Replies
dyusuf
by New Contributor
  • 1287 Views
  • 2 replies
  • 2 kudos

Unity Catalog

Can we set up Unity Catalog on Databricks Community Edition? If yes, please share the process. Thanks.

Latest Reply
szymon_dybczak
Contributor III
  • 2 kudos

Another option you can try is to set up the open-source version of Unity Catalog with Apache Spark, if you don't have the possibility to create an Azure trial account.

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group