Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

eabouzeid
by New Contributor III
  • 8897 Views
  • 10 replies
  • 9 kudos

How to enable interactive Python matplotlib figures in DataBricks?

I want to make matplotlib figures interactive (so I can zoom in/out, etc.) in Databricks. In a Jupyter notebook this is achieved with: %matplotlib notebook. How can I achieve this in Databricks? Thank you

Latest Reply
amu
New Contributor II
  • 9 kudos

Hi there, when facing a similar issue we switched to the Altair Python library, and it works great with Databricks. (Other options are Bokeh or Plotly.)
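
A minimal sketch of that suggestion (assumes the altair package is installed on the cluster, e.g. via %pip install altair; the sample data is made up):

import altair as alt
import pandas as pd

# Build a small pandas DataFrame to plot; .interactive() enables pan/zoom.
pdf = pd.DataFrame({"x": range(100), "y": [i ** 0.5 for i in range(100)]})

# Returning the chart as the last expression of a cell renders it in the notebook.
alt.Chart(pdf).mark_line().encode(x="x", y="y").interactive()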

9 More Replies
brickster
by New Contributor II
  • 2406 Views
  • 3 replies
  • 0 kudos

How to trigger workflow job tasks from Autoloader

I have configured a file-notification Auto Loader stream that monitors an S3 bucket for binary files. I want to integrate Auto Loader with a workflow job so that whenever a file is placed in the S3 bucket, the pipeline job's notebook tasks can pick up the new file and start...
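
One common pattern for this, sketched below with made-up bucket and table names: run the Auto Loader stream inside the job's notebook task with an availableNow trigger, so each job run ingests only the files that arrived since the last run (assumes a runtime that supports trigger(availableNow=True) and file-notification mode already configured):

df = (spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "binaryFile")
    .option("cloudFiles.useNotifications", "true")  # file-notification mode
    .load("s3://<bucket>/landing/"))

(df.writeStream
    .option("checkpointLocation", "s3://<bucket>/checkpoints/landing/")
    .trigger(availableNow=True)  # process all new files, then stop
    .toTable("bronze.landing_files"))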

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Saravanan Ponnaiah, hope everything is going great. Does @odoll odoll's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

2 More Replies
bradlindblad
by New Contributor II
  • 1376 Views
  • 2 replies
  • 1 kudos

Resolved! Font in Databricks Notebook is Greyed Out - Glitchy

The monospaced/code font in my Databricks notebooks is greyed out, in both the light and dark themes. I've tried playing with all the notebook settings, etc., and nothing will make the font 'normal'. I've tried Chrome and Edge, and the results are the same...

Latest Reply
klaapbakken
New Contributor III
  • 1 kudos

I was having this exact same issue. I fixed it by uninstalling the Source Code Pro font from my Windows machine.

1 More Replies
Gk
by New Contributor III
  • 2281 Views
  • 10 replies
  • 1 kudos

Databricks

How can I find the mount point definitions?
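
One way to list them, as a minimal sketch: dbutils.fs.mounts() returns each mount point together with the storage location it maps to.

# Print every mount point and its source URI.
for m in dbutils.fs.mounts():
    print(m.mountPoint, "->", m.source)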

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Govardhana Reddy, glad to hear it! Please mark the answer as best; it would be highly appreciated. Have a great day! Regards

9 More Replies
sanjay
by Valued Contributor II
  • 1798 Views
  • 4 replies
  • 1 kudos

Resolved! How can I get the date when Auto Loader processes a file?

Hi, I am running Auto Loader continuously, checking for new files every minute. I need to store when each file was received/processed, but it's giving me the date when Auto Loader started. Here is my code: df = (spark .readStream .format("clo...

Latest Reply
Lakshay
Esteemed Contributor
  • 1 kudos

Hi @Sanjay Jain, you can use the file metadata column to collect that information. Reference doc: https://docs.databricks.com/ingestion/file-metadata-column.html
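
A minimal sketch of that approach (the input path, format, and column aliases are placeholders): the hidden _metadata column exposes per-file fields such as file_path and file_modification_time, which reflect the source file rather than the stream's start time.

from pyspark.sql import functions as F

df = (spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .load("<input-path>")
    .select(
        "*",
        F.col("_metadata.file_path").alias("source_file"),
        F.col("_metadata.file_modification_time").alias("file_received_at"),
    ))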

3 More Replies
u2dragon
by New Contributor III
  • 8695 Views
  • 5 replies
  • 0 kudos

Resolved! Can't install python library

I'm trying to install a Python library but I'm not able to; the status won't change from "pending". I get this message when I click on the library under the cluster's Libraries tab: "Library installation has been attempted on the driver node but has not...

Latest Reply
u2dragon
New Contributor III
  • 0 kudos

OK, it looks like I was able to solve my problem. First, I needed to install all the required libraries one by one. They are, in order: pandas, six, requests, pyspnego, cryptography, krb5, requests-kerberos. After that I was able to install the webAPI library.
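
A minimal sketch of the same installs using notebook-scoped libraries (assumes the %pip magic is available on the cluster's runtime; the package list is taken from the reply above):

# Install the dependencies before the library that needs them.
%pip install pandas six requests pyspnego cryptography krb5 requests-kerberos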

4 More Replies
Jkb
by New Contributor II
  • 1769 Views
  • 2 replies
  • 2 kudos

Resolved! Workflow triggered by CLI shown as "manually" triggered

We trigger different workflows from ADF. These workflows are shown as triggered "manually". Is this behaviour intentional? At least for users, this is confusing. (Screenshots: the ADF-triggered run and the Databricks Workflows view.)

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @J. G., thank you for your question! To assist you better, please take a moment to review the answer and let me know if it best fits your needs. Please help us select the best solution by clicking on "Select As Best" if it does. Your feedback will...

1 More Replies
Merchiv
by New Contributor III
  • 9171 Views
  • 4 replies
  • 3 kudos

Resolved! How can I add a duration in milliseconds to a timestamp?

Let's say I have a DataFrame with a timestamp and an offset column in milliseconds, in timestamp and long format respectively. E.g. from datetime import datetime; df = spark.createDataFrame([(datetime(2021, 1, 1), 1500,), (dat...

Latest Reply
Merchiv
New Contributor III
  • 3 kudos

Although @Lakshay Goel's solution works, we've been using an alternative approach that we found to be a bit more readable:

from pyspark.sql import Column, functions as f

def make_dt_interval_sec(col: Column):
    return f.expr(f"make_dt_interval...
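
A self-contained sketch of the same idea (column names and sample data are made up): Spark's make_dt_interval(days, hours, mins, secs) builds a day-time interval, so the millisecond offset can be converted to seconds and added to the timestamp.

from datetime import datetime
from pyspark.sql import functions as F

df = spark.createDataFrame(
    [(datetime(2021, 1, 1), 1500)],
    ["ts", "offset_ms"],
)

# Convert ms to (decimal) seconds and pass it as the secs argument.
result = df.withColumn(
    "ts_shifted",
    F.col("ts") + F.expr(
        "make_dt_interval(0, 0, 0, cast(offset_ms as decimal(18, 6)) / 1000)"
    ),
)
result.show(truncate=False)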

3 More Replies
brickster_2018
by Esteemed Contributor
  • 8990 Views
  • 2 replies
  • 0 kudos
Latest Reply
Mooune_DBU
Valued Contributor
  • 0 kudos

It's set as an environment variable called `DATABRICKS_RUNTIME_VERSION`. In your init scripts, you just need to add a line to display or save the info (see the Python example below):

import os
print("DATABRICKS_RUNTIME_VERSION:", os.environ.get('DATABRICKS_RUNTIME_VERSION'))

1 More Replies
bchaubey
by Contributor II
  • 1128 Views
  • 2 replies
  • 0 kudos

voucher

Did you receive your voucher?

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Kashish Khetarpaul, thank you for reaching out! Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training and our team will get back to you shortly.

1 More Replies
Databrick_begin
by New Contributor
  • 1538 Views
  • 1 reply
  • 0 kudos

Databricks notebook to Azure SQL Server connection using a private IP, because public access is denied on the Azure SQL database; Databricks and Azure SQL are in the same subscription but in different virtual networks.

We have created a private endpoint for the Azure SQL database, which has a private IP. By making a hosts-file entry on my system I am able to resolve the IP for the Azure SQL server and connect to it, but I am unable to connect from the Azure Databricks not...

Latest Reply
Ryoma
New Contributor II
  • 0 kudos

If VNet injection is not used, the connection can be established by setting up an init script that adds the Azure private resolver as a nameserver:

#!/bin/bash
mv /etc/resolv.conf /etc/resolv.conf.orig
echo "nameserver <your dns server ip>" | sudo tee --append /etc/resolv.conf

THIAM_HUATTAN
by Valued Contributor
  • 8509 Views
  • 6 replies
  • 7 kudos

Is catalog a feature in the community version?

%sql create catalog if not exists catalog1

I tried the above, but it gives me the error below:

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException: Catalog namespace is not supported. at com.d...

Latest Reply
Kaniz
Community Manager
  • 7 kudos

Hi @THIAM HUAT TAN, it would mean a lot if you could select the "Best Answer" to help others find the correct answer faster. This makes that answer appear right after the question, so it's easier to find within a thread. It also helps us mark the que...

5 More Replies
Dataengineer_mm
by New Contributor
  • 1050 Views
  • 2 replies
  • 0 kudos

Databricks workflow migration to higher environments

How do we migrate Databricks workflows to a higher environment? I do see an option for calling the tasks (notebooks, Python) from GitHub repositories, but how do we migrate the entire workflow jobs to another environment?
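
One possible approach, sketched with hypothetical workspace URLs, tokens, and job id: export a job's settings from the lower environment with the Jobs API (/api/2.1/jobs/get) and recreate it in the target workspace with /api/2.1/jobs/create.

import requests

SOURCE = {"host": "https://dev-workspace.cloud.databricks.com", "token": "<dev-pat>"}
TARGET = {"host": "https://prod-workspace.cloud.databricks.com", "token": "<prod-pat>"}

def export_job(job_id: int) -> dict:
    # Fetch the job definition; the reusable part is the "settings" payload.
    r = requests.get(
        f"{SOURCE['host']}/api/2.1/jobs/get",
        headers={"Authorization": f"Bearer {SOURCE['token']}"},
        params={"job_id": job_id},
    )
    r.raise_for_status()
    return r.json()["settings"]

def import_job(settings: dict) -> int:
    # Create a new job in the target workspace from the exported settings.
    r = requests.post(
        f"{TARGET['host']}/api/2.1/jobs/create",
        headers={"Authorization": f"Bearer {TARGET['token']}"},
        json=settings,
    )
    r.raise_for_status()
    return r.json()["job_id"]

new_job_id = import_job(export_job(job_id=123))  # 123 is a placeholder job id

Note that environment-specific references in the exported settings (cluster IDs, paths, credentials) usually need to be remapped before the create call.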

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Hi @Menaka Murugesan, just a friendly follow-up: did any of the responses help you to resolve your question? If so, please mark it as best. Otherwise, please let us know if you still need help.

1 More Replies