Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

LanceYoung
by New Contributor III
  • 7293 Views
  • 7 replies
  • 6 kudos

Resolved! Unable to make Databricks API calls from an HTML iframe rendered by a notebook's `displayHTML()` call, due to the browser enforcing CORS policy.

My Goal: I want to make my Databricks Notebooks more interactive and have custom HTML/JS UI widgets that guide non-technical people through a business/data process. I want the HTML/JS widget to be able to execute a DB job, or execute some Python code t...
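For context, a minimal sketch of the kind of widget described, rendered with displayHTML(); the button, job ID, and workspace URL are placeholders, and the fetch call is exactly the kind of cross-origin request the browser blocks under CORS:

```python
# Hypothetical widget sketch (not the poster's actual code).
widget_html = """
<div style="font-family:sans-serif">
  <button onclick="runJob()">Run Databricks job</button>
  <script>
    function runJob() {
      // Placeholder values; a direct call like this from the displayHTML
      // iframe is what the browser's CORS policy rejects.
      fetch('https://<workspace-url>/api/2.1/jobs/run-now', {
        method: 'POST',
        headers: {'Authorization': 'Bearer <token>',
                  'Content-Type': 'application/json'},
        body: JSON.stringify({job_id: 123})
      });
    }
  </script>
</div>
"""
displayHTML(widget_html)  # renders the widget in the notebook's output iframe
```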

Latest Reply
Kaniz
Community Manager
  • 6 kudos

Hi @Lance Young​ , Just a friendly follow-up. Do you still need help, or have you resolved your problem using the above solutions? Please let us know.

6 More Replies
Emiel_Smeenk
by New Contributor III
  • 9350 Views
  • 11 replies
  • 9 kudos

Resolved! Databricks Runtime 10.4 LTS - AnalysisException: No such struct field id in 0, 1 after upgrading

Hello, We are working to migrate to Databricks Runtime 10.4 LTS from 9.1 LTS, but we're running into weird behavioral issues. Our existing code works up until runtime 10.3, and in 10.4 it stopped working. Problem: We have a nested JSON file that we are fl...
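For readers skimming the thread, a minimal sketch of the flattening pattern the post describes, with illustrative field names (not the poster's schema); on 10.4 the reported AnalysisException would surface at the struct-field selection step:

```python
# Illustrative only: flatten an array of structs and select a nested field.
from pyspark.sql import functions as F

df = spark.read.json("/path/to/nested.json")   # hypothetical input path

flat = (
    df.withColumn("item", F.explode("items"))  # assumed array-of-structs column
      .select(F.col("item.id"), F.col("item.name"))
)
flat.show()
```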

Latest Reply
Kaniz
Community Manager
  • 9 kudos

Hi @Nirupam Nishant, just a friendly follow-up. Do you still need help, or did my response help you find the solution? Please let us know.

10 More Replies
BasavarajAngadi
by Contributor
  • 3410 Views
  • 12 replies
  • 9 kudos

Resolved! Hi Experts, I am new to Databricks and I want to know how Databricks supports real-time reporting needs in business intelligence?

Delta Lake has 3 levels to maintain data quality (bronze, silver, and gold tables), but while this supports reporting and BI solutions, how does it support streaming analytics? Example: I have an app that loads all the operational data into ADLS...
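As a rough illustration of the streaming side of a medallion layout, here is a minimal sketch (paths are assumptions) that reads a gold Delta table as a stream and keeps a reporting table continuously updated for BI:

```python
# Illustrative paths; replace with real gold/reporting locations.
query = (
    spark.readStream.format("delta")
         .load("/mnt/datalake/gold/sales")                     # gold Delta table
         .writeStream.format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/sales_reporting")
         .outputMode("append")
         .start("/mnt/datalake/reporting/sales_live")          # table BI tools query
)
```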

Latest Reply
Kaniz
Community Manager
  • 9 kudos

@Basavaraj Angadi, just a friendly follow-up. Do you still need help, or did @Werner Stinckens's response help you find the solution? Please let us know.

11 More Replies
BasavarajAngadi
by Contributor
  • 1180 Views
  • 3 replies
  • 4 kudos

Resolved! Hi Experts: I am new to Databricks, please help me with the below. Question: How is a Delta table stored in DBFS?

If I create a Delta table, is the table stored in Parquet format in a DBFS location? And please share how the Parquet files support schema evolution if I do DML operations. As per my understanding: we read data from the data lake first into a DataFrame and try to...
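A minimal sketch of the behaviour the question asks about, with placeholder paths: a Delta table's location holds Parquet data files plus a _delta_log transaction log, and schema evolution on write can be allowed with the mergeSchema option:

```python
# Illustrative paths only.
df = spark.read.json("/mnt/datalake/raw/events")   # hypothetical source data

(df.write.format("delta")
   .mode("append")
   .option("mergeSchema", "true")                  # let new columns be added to the schema
   .save("/mnt/delta/events"))

# The Delta location contains part-*.parquet data files and _delta_log/*.json
display(dbutils.fs.ls("/mnt/delta/events"))
```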

Latest Reply
Kaniz
Community Manager
  • 4 kudos

Hi @Werner Stinckens​ , Thank you so much for your contribution to our Community.

2 More Replies
nickg
by New Contributor III
  • 2742 Views
  • 7 replies
  • 3 kudos

Resolved! I am looking to use the pivot function with Spark SQL (not Python)

Hello. I am trying to use the PIVOT function for email addresses. This is what I have so far:
SELECT fname, lname, awUniqueID, Email1, Email2
FROM xxxxxxxx
PIVOT (
    count(Email) AS Test
    FOR Email
    IN (1 AS Email1, 2 AS Email2)
)
I get everyth...
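For context, a minimal sketch of one way the Spark SQL PIVOT clause can produce Email1/Email2 columns; the table name and the ROW_NUMBER step that assigns 1 and 2 are assumptions, run here via spark.sql:

```python
# Hypothetical table name `contacts`; rn numbers each person's emails 1, 2, ...
pivoted = spark.sql("""
    SELECT *
    FROM (
        SELECT fname, lname, awUniqueID, Email,
               ROW_NUMBER() OVER (PARTITION BY awUniqueID ORDER BY Email) AS rn
        FROM contacts
    )
    PIVOT (
        MAX(Email) FOR rn IN (1 AS Email1, 2 AS Email2)
    )
""")
pivoted.show()
```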

Latest Reply
nickg
New Contributor III
  • 3 kudos

Source data:
fname lname awUniqueID Email
John Smith 22 jsmith@gmail.com
JODI JONES 22 jsmith@live.com
Desired output:
fname lname awUniqueID Em...

6 More Replies
HarshaK
by New Contributor III
  • 8508 Views
  • 4 replies
  • 6 kudos

Resolved! partitionBy() on Delta files

Hi All, I am trying to use partitionBy() on a Delta file in PySpark using the command:
df.write.format("delta").mode("overwrite").option("overwriteSchema","true").partitionBy("Partition Column").save("Partition file path")
-- It doesn't seem to w...
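As a point of comparison, a minimal sketch of a partitioned Delta write with an explicit (hypothetical) partition column and path; partitionBy() needs the exact name of a column present in the DataFrame:

```python
# Placeholder column and path names.
(df.write.format("delta")
   .mode("overwrite")
   .option("overwriteSchema", "true")
   .partitionBy("country")                    # hypothetical partition column
   .save("/mnt/delta/customers_partitioned"))

# Each partition value gets its own sub-directory, e.g. country=US/
display(dbutils.fs.ls("/mnt/delta/customers_partitioned"))
```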

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hey @Harsha kriplani, hope you are well. Thank you for posting here. It is awesome that you found a solution. Would you like to mark Hubert's answer as best? It would be really helpful for the other members too. Cheers!

3 More Replies
sannycse
by New Contributor II
  • 2656 Views
  • 6 replies
  • 6 kudos

Resolved! Read the CSV file as shown in the description

Project_Details.csv
ProjectNo|ProjectName|EmployeeNo
100|analytics|1
100|analytics|2
101|machine learning|3
101|machine learning|1
101|machine learning|4
Find the employees working on each project in the form of a list.
Output:
ProjectNo|employeeNo
100|[1,2]
101|...
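One possible approach (the file path is an assumption) is to read the pipe-delimited file and collect the employee numbers into a list per project:

```python
from pyspark.sql import functions as F

# Hypothetical location of the uploaded file.
df = (spark.read
           .option("header", "true")
           .option("delimiter", "|")
           .csv("/FileStore/tables/Project_Details.csv"))

result = (df.groupBy("ProjectNo")
            .agg(F.collect_list("EmployeeNo").alias("employeeNo")))
result.show(truncate=False)
```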

Latest Reply
Kaniz
Community Manager
  • 6 kudos

Hi @SANJEEV BANDRU​ , Just a friendly follow-up. Do you still need help? Please let us know.

5 More Replies
Serhii
by Contributor
  • 2735 Views
  • 5 replies
  • 8 kudos

Resolved! init_script error during cluster creation - 101: Network is unreachable

When I run the init_script during cluster creation:
apt-get update && apt-get install -y ffmpeg libsndfile1-dev
I get an error in the cluster logs:
E: Failed to fetch http://archive.ubuntu.com/ubuntu/pool/universe/o/openal-soft/libopenal1_1.19.1-1_amd64.deb ...
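For reference, a minimal sketch of storing that same command as a cluster-scoped init script on DBFS (the path is a placeholder; this does not by itself fix the network error the post reports):

```python
# Hypothetical DBFS path for the init script; dbutils.fs.put(path, contents, overwrite).
dbutils.fs.put(
    "/databricks/init-scripts/install-audio-deps.sh",
    """#!/bin/bash
set -e
apt-get update && apt-get install -y ffmpeg libsndfile1-dev
""",
    True,
)
```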

Latest Reply
Kaniz
Community Manager
  • 8 kudos

Hi @Sergii Ivakhno, just a friendly follow-up. Do you still need help, or did @Pratik Bhawsar's response help you find the solution? Please let us know.

4 More Replies
Manoj
by Contributor II
  • 967 Views
  • 3 replies
  • 6 kudos

Resolved! Does a job cluster help jobs that are fighting for resources on an all-purpose cluster?

Hi Team, does a job cluster help jobs that are fighting for resources on an all-purpose cluster? With a job cluster, the drawback I see is the creation of a cluster every time the job starts; it's taking 2 mins to spin up the cluster. Instead of...
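For illustration, a minimal sketch of a job definition that uses its own job cluster rather than an all-purpose cluster (all values are placeholders; instance pools are a common way to reduce the spin-up time mentioned):

```python
# Hypothetical job payload for the Jobs API (POST /api/2.1/jobs/create).
job_spec = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "etl",
            "notebook_task": {"notebook_path": "/Repos/etl/nightly"},
            "new_cluster": {                      # dedicated job cluster
                "spark_version": "10.4.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 4,
            },
        }
    ],
}
```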

Latest Reply
Kaniz
Community Manager
  • 6 kudos

Hi @Manoj Kumar Rayalla, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s response help you find the solution? Please let us know.

2 More Replies
yoniau
by New Contributor II
  • 1871 Views
  • 3 replies
  • 5 kudos

Resolved! Different configurations for same Databricks Runtime version

Hi all, On my DBR installations, the s3a scheme is mapped to shaded.databricks.org.apache.hadoop.fs.s3a.S3AFileSystem. On my customer's DBR installations it is mapped to com.databricks.s3a.S3AFileSystem. We both use the same DBR runtime, and none of us has...
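One way to compare the two environments (a sketch using PySpark's internal Hadoop configuration accessor) is to print which class the s3a scheme resolves to on each cluster:

```python
# Prints the configured S3A filesystem implementation on the current cluster.
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()
print(hadoop_conf.get("fs.s3a.impl"))
```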

Latest Reply
Kaniz
Community Manager
  • 5 kudos

Hi @Yoni Au, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s and @Prabakar Ammeappin's responses help you find the solution? Please let us know.

2 More Replies
greyfine
by New Contributor II
  • 5798 Views
  • 4 replies
  • 7 kudos

Resolved! Hi everyone, I was wondering if it is possible to set up query-level alerts for PySpark notebooks that are run on a schedule in Databricks, so that if we get some expected result we can receive a mail alert?

In the above you can see we have 3 workspaces. We have the alert option available in the SQL workspace but not in our Data Science and Engineering workspace; is there any way we can incorporate this in our DS and Engineering workspace?

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hey there @Atul Vaid, thank you for posting your question. Were you able to find a solution from the answers above? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you.

3 More Replies
LorenRD
by Contributor
  • 5661 Views
  • 11 replies
  • 13 kudos

Resolved! Is it possible to connect Databricks SQL with AWS Redshift DB?

I would like to know if it's possible to connect the Databricks SQL module not only with the internal metastore DB and tables from the Data Science and Engineering module, but also with an AWS Redshift DB to run queries and create alerts.

Latest Reply
LorenRD
Contributor
  • 13 kudos

Hi @Kaniz Fatma, I contacted customer support explaining this issue; they told me that this feature is not implemented yet but it's on the roadmap with no ETA. It would be great if you could ping me back when it's possible to access Redshift tables from SQ...

10 More Replies
AmanSehgal
by Honored Contributor III
  • 1850 Views
  • 4 replies
  • 10 kudos

Migrating data from delta lake to RDS MySQL and ElasticSearch

There are mechanisms (like DMS) to get data from RDS to a delta lake and store the data in Parquet format, but is it possible to do the reverse of this in AWS? I want to send data from the data lake to MySQL RDS tables in batch mode. And the next step is to send th...
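For the batch MySQL leg, a minimal sketch using the Spark JDBC writer (endpoint, credentials, and table names are placeholders; the Elasticsearch leg would typically go through the elasticsearch-hadoop connector instead):

```python
# Hypothetical Delta source and MySQL target.
df = spark.read.format("delta").load("/mnt/delta/orders")

(df.write.format("jdbc")
   .option("url", "jdbc:mysql://<rds-endpoint>:3306/reporting")
   .option("dbtable", "orders")
   .option("user", "<user>")
   .option("password", "<password>")
   .mode("append")
   .save())
```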

Latest Reply
AmanSehgal
Honored Contributor III
  • 10 kudos

@Kaniz Fatma and @Hubert Dudek - writing to MySQL RDS is relatively simple. I'm still looking for ways to export data into Elasticsearch.

3 More Replies
Michael_Galli
by Contributor II
  • 7164 Views
  • 7 replies
  • 8 kudos

Resolved! Monitoring Azure Databricks in an Azure Log Analytics Workspace

Does anyone have experience with the mspnp/spark-monitoring library? Is this best practice, or are there better ways to monitor a Databricks cluster?

Latest Reply
User16764241763
Honored Contributor
  • 8 kudos

@Michael Galli I don't think you can monitor metrics captured by mspnp/spark-monitoring in Datadog; there is a service called Azure Log Analytics workspace where these logs are available for querying. You can also check out the below if you are interest...

6 More Replies