Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

gbrueckl
by Contributor II
  • 8629 Views
  • 6 replies
  • 4 kudos

Resolved! CREATE FUNCTION from Python file

Is it somehow possible to create an SQL external function using Python code? The examples only show how to use JARs (https://docs.databricks.com/spark/latest/spark-sql/language-manual/sql-ref-syntax-ddl-create-function.html). Something like: CREATE TEMPORAR...

Latest Reply
pts
New Contributor II
  • 4 kudos

As a user of your code, I'd find it a less pleasant API because I'd have to call some_module.some_func.some_func() rather than just some_module.some_func(). No reason to have "some_func" exist twice in the hierarchy; it's kind of redundant. If some_func is ...

5 More Replies
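For readers landing on this thread later: Databricks has since added SQL-callable functions implemented in Python (Unity Catalog Python UDFs), which addresses the original question without JARs. A minimal sketch, assuming a Unity Catalog-enabled workspace; the catalog, schema, and function names are illustrative:

```sql
-- Hypothetical three-level name; adjust catalog/schema to your workspace
CREATE OR REPLACE FUNCTION main.default.squared(x INT)
RETURNS INT
LANGUAGE PYTHON
AS $$
return x * x
$$;

-- Callable from plain SQL afterwards
SELECT main.default.squared(4);
```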
GC-James
by Contributor II
  • 3487 Views
  • 2 replies
  • 3 kudos

Resolved! Working locally then moving to databricks

Hello Databricks, I'm struggling with a workflow issue and wondering if anyone can help. I am developing my project in R, and sometimes Python, locally on my laptop, and committing the files to a git repo. I can then clone that repo in Databricks and *see*...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

This is a separate script which then needs to be run from a notebook (or job). I am not using R, but in Python and Scala it works the same. In Python I just import it in the notebook ("from folder_structure import myClass"); in R it is probably similar. There ...

1 More Replies
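To make the suggested import pattern concrete, here is a minimal, self-contained sketch. The package layout and helper function are illustrative, not taken from the thread; in a Databricks repo the repo root is already on sys.path, so the setup below only simulates that layout for a local run:

```python
import os
import sys

# Simulate a repo layout: a helper module committed alongside the notebooks
os.makedirs("shared", exist_ok=True)
with open("shared/__init__.py", "w") as f:
    f.write("")
with open("shared/helpers.py", "w") as f:
    f.write(
        "def clean_names(names):\n"
        "    return [n.strip().lower() for n in names]\n"
    )

# Databricks Repos adds the repo root to sys.path automatically;
# locally we do it by hand.
sys.path.insert(0, os.getcwd())

# In a notebook this import is the only line you need:
from shared.helpers import clean_names

print(clean_names(["  Alice ", "BOB"]))  # ['alice', 'bob']
```

The same pattern works for R via source() on a file in the cloned repo, though the thread does not show an R example.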
al_joe
by Contributor
  • 3478 Views
  • 2 replies
  • 2 kudos

Resolved! Execute a notebook cell with a SINGLE mouse-click?

Currently it takes two mouse-clicks to execute each cell in a DB notebook. I know there is a keyboard shortcut (Ctrl+Enter) to execute the current cell. But is there a way to execute a cell with a single mouse-click? I could use a Greasemonkey script or ...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Simple answer: no.

1 More Replies
Mirko
by Contributor
  • 3512 Views
  • 3 replies
  • 0 kudos

Resolved! Location for DB and for specific tables in DB

The following situation: I am creating a database with a location somewhere in my Azure Data Lake Gen2: CREATE SCHEMA IF NOT EXISTS curated LOCATION 'somelocation'. Then I want a specific table in curated to be in a subfolder of 'somelocation': CREATE TABLE IF...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Mirko Ludewig - Thanks for letting us know. I don't like strange all that much, but I do like working as desired!

2 More Replies
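The pattern asked about in this thread can be sketched as follows; the storage paths, container, and table name are illustrative placeholders, not values from the original post:

```sql
-- Schema rooted at a folder in the lake
CREATE SCHEMA IF NOT EXISTS curated
LOCATION 'abfss://container@account.dfs.core.windows.net/curated';

-- External table placed in a subfolder of the schema's location
CREATE TABLE IF NOT EXISTS curated.sales (id INT, amount DOUBLE)
USING DELTA
LOCATION 'abfss://container@account.dfs.core.windows.net/curated/sales';
```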
data_scientist
by New Contributor II
  • 3412 Views
  • 1 replies
  • 1 kudos

how to load a .w2v format saved model in databricks

Hi, I am trying to load a pre-trained word2vec model which has been saved in .w2v format in Databricks. I am not able to load this file. Could you help me with the correct command?

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi there and welcome to the community! My name is Piper, and I'm a moderator for the community. Thank you for coming to us with your question. We will give it a bit to see how your peers respond and then we will circle back if we need to.

Balaramya
by Databricks Partner
  • 2188 Views
  • 2 replies
  • 1 kudos

  Hi Team, I took the Databricks Apache Spark 3.0 (Scala) exam on 25th January 2022 (IST 9AM to 11AM) and passed it, but still have not received m...

Hi Team, I took the Databricks Apache Spark 3.0 (Scala) exam on 25th January 2022 (IST 9AM to 11AM) and passed it, but still have not received my badge. I have contacted the support team twice but still no response. @Kaniz Fatma, kindly help to m...

Latest Reply
Balaramya
Databricks Partner
  • 1 kudos

Databricks team, kindly help on the above

1 More Replies
Mirko
by Contributor
  • 16154 Views
  • 12 replies
  • 2 kudos

Resolved! strange error with dbutils.notebook.run(...)

The situation is as follows: I have a scheduled job which uses dbutils.notebook.run(path, timeout). During the last week everything worked smoothly. Over the weekend the job began to fail at the dbutils.notebook.run(path, timeout) command. I get th...

Latest Reply
User16753724663
Databricks Employee
  • 2 kudos

Hi @Florent POUSSEROT, apologies for the delay. Could you please confirm if you are still facing the issue?

11 More Replies
ST
by New Contributor II
  • 4255 Views
  • 1 replies
  • 2 kudos

Resolved! Convert Week of Year to Month in SQL?

Hi all, was wondering if there is any built-in function or code that I could utilize to convert a single week-of-year integer (i.e. 1 to 52) into a value representing the month (i.e. 1-12)? The assumption is that a week starts on a Monday and ends on a...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 2 kudos

We need the old parser, as the new one doesn't support weeks. Then we can map what we need using w (week of year) and u (day of week): spark.sql("set spark.sql.legacy.timeParserPolicy=LEGACY") spark.sql(""" SELECT extract( month from to_date...

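The accepted answer uses Spark's legacy time parser; the same week-to-month mapping can also be sketched in plain Python with the standard library. This assumes ISO-8601 numbering (weeks start on Monday, as the question requires) and the convention that the week's Monday decides the month; the function name is illustrative:

```python
from datetime import date

def week_of_year_to_month(year: int, week: int) -> int:
    """Map an ISO week number (1-53) to the month of that week's Monday."""
    # fromisocalendar(year, week, 1) is the Monday of the given ISO week
    return date.fromisocalendar(year, week, 1).month

print(week_of_year_to_month(2022, 1))   # 1  (Monday 2022-01-03)
print(week_of_year_to_month(2022, 52))  # 12 (Monday 2022-12-26)
```

Other conventions (e.g. letting the Thursday of the week decide the month) only change which day is passed to fromisocalendar.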
Constantine
by Contributor III
  • 5359 Views
  • 1 replies
  • 2 kudos

Resolved! OPTIMIZE throws an error after doing MERGE on the table

I have a table on which I do an upsert, i.e. MERGE INTO table_name ... After that I run OPTIMIZE table_name, which throws an error: java.util.concurrent.ExecutionException: io.delta.exceptions.ConcurrentDeleteReadException: This transaction attempted to read...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 2 kudos

You can try to change the isolation level: https://docs.microsoft.com/en-us/azure/databricks/delta/optimizations/isolation-level. In a merge it is good to specify all partitions in the merge conditions. It can also happen when the script is running concurrently.

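The isolation-level change the reply links to is a Delta table property; a minimal sketch (the table name is a placeholder, and relaxing isolation trades some serializability guarantees for fewer write conflicts):

```sql
-- Allow OPTIMIZE and concurrent MERGEs to conflict less often
ALTER TABLE table_name
SET TBLPROPERTIES ('delta.isolationLevel' = 'WriteSerializable');
```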
Jan_A
by New Contributor III
  • 6747 Views
  • 3 replies
  • 3 kudos

Resolved! How to include notebook dashboards in repos (github)?

Goal: I would like dashboards in notebooks to be added to repos (GitHub). When I commit and push changes to GitHub, the dashboard part is not included. Is there a way to include the dashboard in the repo? When I later pull, only the notebook code is...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

There is an API to get dashboards, so you would need to build a custom CI/CD pipeline with a step that fetches the dashboard and its elements through the API and then saves the returned JSON to git. You could also deploy a script to an Azure Function or AWS Lambda to d...

2 More Replies