Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Smitha1
by Valued Contributor II
  • 2474 Views
  • 1 replies
  • 2 kudos

#00244807 and #00245872 Ticket Status - HIGH Priority

Dear @Vidula Khanna​, Databricks team, @nadia Elsayed​ @Jose Gonzalez​ @Aden Jaxson​ What is the SLA/ETA for a normal-priority ticket and a HIGH-priority ticket? I created tickets #00244807 on 7th Dec and #00245872 but haven't received any u...

image.png
Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

You can only create high-priority cases if you have an enterprise plan; as a normal user you can only create normal-priority cases. If you have an enterprise plan, you can escalate the case, and the Databricks team will get back to you there soon.

john_odwyer
by Databricks Employee
  • 6206 Views
  • 1 replies
  • 1 kudos

Resolved! Masking A Data Column

Is there a way to mask the data in a column in a table from specific users or user groups?

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 1 kudos

Yes, this doc will be helpful for you: https://www.databricks.com/blog/2020/11/20/enforcing-column-level-encryption-and-avoiding-data-duplication-with-pii.html

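Beyond encryption, Databricks can also mask a column at query time with a dynamic view using the built-in `is_member()` function, then grant users access to the view rather than the table. A minimal sketch, assuming hypothetical table, column, and group names:

```sql
-- Sketch: expose masked data through a view instead of the raw table.
-- Table (hr.employees), column (email), and group (hr_admins) are hypothetical.
CREATE OR REPLACE VIEW hr.employees_masked AS
SELECT
  id,
  name,
  CASE
    WHEN is_member('hr_admins') THEN email  -- privileged group sees the real value
    ELSE '***REDACTED***'                   -- everyone else sees a mask
  END AS email
FROM hr.employees;
```

Users then get SELECT on the view only, not on the underlying table.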
Mahendra1
by New Contributor III
  • 1353 Views
  • 1 replies
  • 0 kudos

Materials for preparing for the Databricks professional exam

Hi all, are there any books or materials for studying for the Databricks professional certification? Thank you!

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

Please check Databricks Academy; there you will find the right courses.

183530
by New Contributor III
  • 1496 Views
  • 2 replies
  • 1 kudos

I need a regex to match a whole word wrapped in parentheses

SELECT '(CC) ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST1,
       'A(CC) ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST2,
       'A (CC)A ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST3,
       'A (CC) A ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST4,
       'A ABC (CC)' REGEXP '\\b\\(CC\\)\\b' AS TES...

Latest Reply
183530
New Contributor III
  • 1 kudos

Get the whole word "(CC)". I had already written the output. Expected output:
'(CC) ABC' REGEXP <<regex>> = TRUE
'A(CC) ABC' REGEXP <<regex>> = FALSE
'A (CC)A ABC' REGEXP <<regex>> = FALSE
'A (CC) A ABC' REGEXP <<regex>> = TRUE
'A ABC (CC)' REGEXP <<regex>> = ...

1 More Replies
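One sketch of a working pattern, given the expected output above: `\b` fails next to `(` because parentheses are not word characters, so lookarounds that forbid adjacent non-whitespace do the job instead. Spark SQL's REGEXP/RLIKE uses Java regex, which supports the same lookarounds (e.g. `'(?<!\\S)\\(CC\\)(?!\\S)'`). A minimal check in Python:

```python
import re

# Match "(CC)" only as a whitespace-delimited token.
# (?<!\S) = no non-space immediately before; (?!\S) = no non-space immediately after.
pattern = re.compile(r"(?<!\S)\(CC\)(?!\S)")

tests = {
    "(CC) ABC": True,
    "A(CC) ABC": False,
    "A (CC)A ABC": False,
    "A (CC) A ABC": True,
    "A ABC (CC)": True,
}
for s, expected in tests.items():
    assert bool(pattern.search(s)) == expected, s
```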
akshay_1333
by New Contributor III
  • 1281 Views
  • 1 replies
  • 3 kudos

Notebook formatting

I am using a DBR 10.4 LTS instance; can anyone help me format the code? I have tried Format Python, but an error pops up asking me to upgrade to DBR 11.2. Is there any alternative to this?

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 3 kudos

Please share your code so that we can help you.

Ossian
by New Contributor
  • 2837 Views
  • 1 replies
  • 0 kudos

Driver restarts and job dies after 10-20 hours (Structured Streaming)

I am running a java/jar Structured Streaming job on a single node cluster (Databricks runtime 8.3). The job contains a single query which reads records from multiple Azure Event Hubs using Spark Kafka functionality and outputs results to a mssql dat...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

It seems that when your nodes scale up, the cluster looks for the init script and it fails. You can use reserved instances for this activity instead of spot instances, though it will increase your overall cost. Alternatively, you can use dependent librar...

Pragat
by New Contributor
  • 1761 Views
  • 1 replies
  • 0 kudos

Databricks job parameterization

I am configuring a Databricks job using multiple notebooks that depend on each other. All the notebooks are parameterized and use similar parameters. How can I configure the parameters at a global level so that all the notebooks can consume...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

Actually, it is very hard, but as an alternative you can change your code and use the widgets feature of Databricks. This may not be the right option, but you can still explore this doc for testing purposes: https://docs.databric...

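Another option, short of a true global parameter: pass the same `base_parameters` to every notebook task in the job definition, and read them in each notebook with `dbutils.widgets.get(...)`. A hedged sketch of a Jobs API 2.1 payload (job name, paths, and parameter values are hypothetical):

```json
{
  "name": "my_pipeline",
  "tasks": [
    {
      "task_key": "step1",
      "notebook_task": {
        "notebook_path": "/Shared/step1",
        "base_parameters": { "env": "dev", "run_date": "2023-01-01" }
      }
    },
    {
      "task_key": "step2",
      "depends_on": [ { "task_key": "step1" } ],
      "notebook_task": {
        "notebook_path": "/Shared/step2",
        "base_parameters": { "env": "dev", "run_date": "2023-01-01" }
      }
    }
  ]
}
```

The values are still repeated per task, but they live in one JSON document, so they can be templated or generated from a single source.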
Netty
by New Contributor III
  • 5726 Views
  • 1 replies
  • 2 kudos

What's the crontab notation for every other week for Databricks Workflow scheduling?

Hello, I need to schedule some of my jobs within Databricks Workflows every other week (or every 4 weeks). I've scoured a few forums to find what this notation would be, but my searches have been unfruitful. Is this scheduling possible in crontab? I...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 2 kudos

For every seven days starting from Monday, you need to use 2/7. From my experience, this generator works best with Databricks: https://www.freeformatter.com/cron-expression-generator-quartz.html

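Quartz cron (which Databricks schedules use) has no native "every other week" notation, so a common workaround is to schedule the job weekly and exit early on the off weeks. A minimal sketch of such a guard, assuming the convention "run only on even ISO weeks":

```python
from datetime import date

def should_run(today: date) -> bool:
    """Run only on even ISO week numbers -- a workaround, since Quartz
    cron cannot express a biweekly schedule directly."""
    return today.isocalendar()[1] % 2 == 0

# ISO week 1 of 2023 starts Mon 2023-01-02 (odd week -> skip);
# ISO week 2 starts Mon 2023-01-09 (even week -> run).
assert not should_run(date(2023, 1, 2))
assert should_run(date(2023, 1, 9))
```

The job's first cell calls `should_run(date.today())` and returns immediately when it is False; the parity convention (even vs. odd weeks) is an arbitrary choice.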
Heman2
by Valued Contributor II
  • 3900 Views
  • 6 replies
  • 19 kudos

Can anyone let me know, is there any way in which we can access another workspace's Delta tables in a workspace where we run the pipelines using Python...

Can anyone let me know, is there any way in which we can access another workspace's Delta tables in a workspace where we run the pipelines using Python?

Latest Reply
Harish2122
Contributor
  • 19 kudos

@Hemanth A​, go to the workspace you want data from; in the warehouse tab you will find the connection details. Copy the host name and HTTP path, and generate a token. With these credentials you can access the data of that workspace from any other workspace.

5 More Replies
dulu
by New Contributor III
  • 7795 Views
  • 3 replies
  • 15 kudos

Resolved! How to count the number of campaigns per day based on the start and end dates of the campaigns in Spark SQL on Databricks

I need to count the number of campaigns per day based on the start and end dates of the campaigns. The input table and the needed output (result) are shown in the attached images. How do I need to write the SQL command in Databricks to get the above result? Thanks all.

(attached: image 1, image 2)
Latest Reply
Hubert-Dudek
Databricks MVP
  • 15 kudos

Just create an array with sequence, explode it, and then group and count:

WITH cte AS (
  SELECT `campaign name`,
         explode(sequence(`Start date`, `End date`, interval 1 day)) AS `Date`
  FROM `campaigns`
)
SELECT Count(`campaign name`) AS `count uni...

2 More Replies
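The sequence-and-explode idea above can be sketched in plain Python as well: expand each campaign into one row per active day, then count rows per day. The campaign rows here are hypothetical sample data, not the poster's table:

```python
from datetime import date, timedelta
from collections import Counter

# Hypothetical campaign rows: (name, start date, end date), dates inclusive.
campaigns = [
    ("A", date(2022, 11, 1), date(2022, 11, 3)),
    ("B", date(2022, 11, 2), date(2022, 11, 4)),
]

# Same idea as sequence() + explode(): one row per day each campaign is
# active, then a group-and-count over the exploded days.
per_day = Counter()
for name, start, end in campaigns:
    d = start
    while d <= end:
        per_day[d] += 1
        d += timedelta(days=1)

assert per_day[date(2022, 11, 2)] == 2  # both A and B are active
assert per_day[date(2022, 11, 1)] == 1  # only A is active
```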
183530
by New Contributor III
  • 824 Views
  • 0 replies
  • 1 kudos

Needed a regex to match (CC)

SELECT '(CC) ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST1,
       'A(CC) ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST2,
       'A (CC)A ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST3,
       'A (CC) A ABC' REGEXP '\\b\\(CC\\)\\b' AS TEST4,
       'A ABC (CC)' REGEXP '\\b\\(CC\\)\\b' AS TES...

seberino
by New Contributor III
  • 1762 Views
  • 0 replies
  • 1 kudos

How do I revoke SELECT permissions on a table in Data Explorer when it only lets me revoke new explicit grants I've added myself?

I'm able to make it to the Permissions page of the schema and table I'm trying to do access control on within the Data Explorer page. At first you can only grant permissions but not revoke anything; only after you have made new grants can you revoke w...
