Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.
Data + AI Summit 2024 - Data Science & Machine Learning

Forum Posts

Direo
by Contributor
  • 1886 Views
  • 3 replies
  • 2 kudos

Feature store feature table location

Can Databricks feature tables be stored outside of DBFS?

Latest Reply
the-sab
New Contributor II
Yes, Databricks feature tables can be stored outside of Databricks File System (DBFS). You can store your feature tables in external storage systems such as Amazon S3, Azure Blob Storage, Azure Data Lake Storage, or Hadoop Distributed File System (HD...
2 More Replies
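A minimal sketch of what the accepted answer describes, assuming the `databricks.feature_store` client and that your client version supports the `path` argument; the table name, source table, and storage URL below are placeholders:

```python
# Sketch, assuming the feature store client version supports `path`;
# the names and the abfss URL are placeholders.
from databricks.feature_store import FeatureStoreClient

fs = FeatureStoreClient()
features_df = spark.table("raw.customer_features")  # hypothetical source

fs.create_table(
    name="ml.customer_features",
    primary_keys=["customer_id"],
    df=features_df,
    # Store the backing Delta table outside DBFS, e.g. on ADLS:
    path="abfss://features@myaccount.dfs.core.windows.net/customer_features",
    description="Customer features stored on external storage",
)
```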
LLMwithML
by New Contributor II
  • 437 Views
  • 0 replies
  • 1 kudos

Data + AI Summit

The Databricks Data + AI Summit 2023 has been great so far. I just completed the two-day Data Management Training, where I learned a lot of practical tips on making my pipelines more efficient and robust. After these two days of sessions I got a good id...

kll
by New Contributor III
  • 5891 Views
  • 1 reply
  • 1 kudos

Internal error: com.databricks.rpc.RPCResponseTooLarge, when attempting to use mosaic's st_intersects

I get an exception when attempting to run the following line of code, which filters a Spark DataFrame based on the geometry: `df_tx = df_zip.filter(st_intersects(st_aswkt("zip_code_geom"), tx_poly))` followed by `df_tx.show()`, where `tx_poly` is `tx_poly = shapely...`

Latest Reply
WernerS
New Contributor III
I am not familiar with st_intersects, but it seems that it runs solely on the driver (as Python code, not Spark). Does Mosaic work in PySpark? If not, try using a larger driver.
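An unverified sketch of the direction the reply hints at, assuming the databricks-mosaic package is installed and registered for PySpark; passing the polygon as a WKT string literal keeps the predicate inside Spark rather than shipping a large driver-side object (whether this avoids the RPC size limit in this case is untested):

```python
# Unverified sketch: assumes the databricks-mosaic package is installed.
from pyspark.sql import functions as F
from mosaic import enable_mosaic, st_intersects, st_aswkt

enable_mosaic(spark, dbutils)  # register Mosaic's Spark SQL expressions

tx_wkt = tx_poly.wkt  # shapely polygon -> compact WKT string

df_tx = df_zip.filter(st_intersects(st_aswkt("zip_code_geom"), F.lit(tx_wkt)))
df_tx.show()
```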
mbaumga
by New Contributor III
  • 2506 Views
  • 7 replies
  • 9 kudos

How to request the addition of pre-installed R packages on the clusters?

Today, many R packages are pre-installed on the standard clusters on Databricks. Libraries like "tidyverse", "ggplot2", etc. are there, as is the great library "readxl" to load Excel files. But unfortunately, its counterpart "writexl" is not pre-instal...

Latest Reply
wicckkjoe
New Contributor II
I just need to figure out who decides which R packages are pre-installed on the cluster.
6 More Replies
sher
by Valued Contributor II
  • 925 Views
  • 1 reply
  • 1 kudos

How to execute a CHANGE_TRACKING Snowflake query in Databricks

Please check the attached image, which shows the issue that needs to be resolved. Has anyone come across this kind of issue?

Latest Reply
Anonymous
Not applicable
Hi @sherbin w, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
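Since CHANGE_TRACKING is set through Snowflake DDL, it has to run on Snowflake itself rather than through a Spark read. A sketch using the Snowflake Spark connector's `Utils.runQuery` via the JVM gateway; all connection options and the table name are placeholders:

```python
# Sketch, assuming the Snowflake Spark connector is installed on the cluster.
sf_options = {  # placeholder connection options
    "sfURL": "myaccount.snowflakecomputing.com",
    "sfUser": "user",
    "sfPassword": "password",
    "sfDatabase": "MY_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MY_WH",
}

# runQuery executes arbitrary SQL (including DDL) directly on Snowflake.
sf_utils = spark.sparkContext._jvm.net.snowflake.spark.snowflake.Utils
sf_utils.runQuery(sf_options, "ALTER TABLE my_table SET CHANGE_TRACKING = TRUE")
```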
rahullalwani25
by New Contributor II
  • 2882 Views
  • 4 replies
  • 5 kudos

to_utc_timestamp is subtracting a different time delta and from_utc_timestamp is not adding the same delta.

My session timezone is Australia/Sydney. If I run the below query, my expectation is that the first and third columns should show the same value, but it is not working as expected for the 1753-01-01 00:00:00 timestamp: `spark.conf.set("spark.sql.session.timeZone...`

Latest Reply
Pavithra_R
New Contributor II
Hi @Rahul Lalwani, on interactive clusters `spark.sql.datetime.java8API.enabled` is disabled by default; when we set `spark.sql.datetime.java8API.enabled` to true, we can see correct values for 1753-01-01 as well. The reason for enabling the above config ...
3 More Replies
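A minimal sketch reproducing the reply's suggestion, showing where the config goes before round-tripping the problem timestamp:

```python
from pyspark.sql import functions as F

spark.conf.set("spark.sql.session.timeZone", "Australia/Sydney")
# Per the reply: the Java 8 time API handles pre-Gregorian-cutover dates
# such as 1753-01-01 correctly.
spark.conf.set("spark.sql.datetime.java8API.enabled", "true")

df = spark.sql("SELECT timestamp'1753-01-01 00:00:00' AS ts")
utc = F.to_utc_timestamp("ts", "Australia/Sydney")
df.select(
    "ts",
    utc.alias("utc"),
    F.from_utc_timestamp(utc, "Australia/Sydney").alias("round_trip"),
).show(truncate=False)
```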
Mado
by Valued Contributor II
  • 4479 Views
  • 2 replies
  • 1 kudos

Error "Invalid configuration value detected for fs.azure.account.key" when listing files stored in an Azure Storage account using "dbutils.fs.ls"

I get the following error when getting a list of files stored in an Azure Storage account using the "dbutils.fs.ls" command in Databricks: "Failure to initialize configuration for storage account AAAAA.dfs.core.windows.net: Invalid configuration value dete...

Latest Reply
Anonymous
Not applicable
Hi @Mohammad Saber, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
1 More Replies
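For anyone hitting the same error: a sketch of account-key auth for ABFS, assuming the key lives in a secret scope (all names are placeholders). Note that a session-level `spark.conf.set` is sometimes not picked up by `dbutils.fs`; if so, set the same key in the cluster's Spark config instead:

```python
storage_account = "AAAAA"  # placeholder, from the error message
account_key = dbutils.secrets.get(scope="my-scope", key="storage-key")  # placeholder

# The config key must name the exact account and endpoint you are listing.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    account_key,
)

display(dbutils.fs.ls(f"abfss://mycontainer@{storage_account}.dfs.core.windows.net/"))
```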
marvin1
by New Contributor III
  • 242 Views
  • 1 reply
  • 0 kudos

Unable to configure compute of cloned job

If I clone an existing job without making any changes, I am able to reconfigure the compute successfully. If I remove or add a Spark environment variable to test modifications, such as using secrets for example, and I confirm the changes to the job, ...

Latest Reply
Anonymous
Not applicable
Hi @Marvin Ginns, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
david_torres
by New Contributor II
  • 2065 Views
  • 3 replies
  • 4 kudos

Can you use autoloader with a fixed width file?

I have a collection of fixed-width files that I would like to ingest monthly with Auto Loader, but I can't seem to find an example. I can read the files into DataFrames using a Python function to map the index and length of each field with no issues, but ...

Latest Reply
david_torres
New Contributor II
I found a way to get what I needed, and I can apply this to any fixed-width file. Will share for anyone trying to do the same thing. I accomplished this in a Python notebook and will explain the code: Import the libraries needed and define a schema. I...
2 More Replies
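A sketch of the approach the reply describes: land each line whole with Auto Loader's text format, then slice columns by position. The field map and paths are placeholders:

```python
from pyspark.sql import functions as F

field_map = {              # name: (1-indexed start, length) -- placeholders
    "customer_id": (1, 10),
    "name":        (11, 30),
    "balance":     (41, 12),
}

raw = (spark.readStream
       .format("cloudFiles")
       .option("cloudFiles.format", "text")  # each line arrives whole in `value`
       .schema("value STRING")
       .load("/mnt/landing/fixed_width/"))   # placeholder path

parsed = raw.select(
    *[F.substring("value", start, length).alias(name)
      for name, (start, length) in field_map.items()]
)
```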
g96g
by New Contributor III
  • 668 Views
  • 1 reply
  • 0 kudos

Read a file from the data lake in Databricks - "No such file or directory" error

I have a problem with reading a file from ADLS Gen2. I have done the mounting properly, as after executing `dbutils.fs.ls('/mnt/bronze')` I can see the file path. This is how I did the mounting: `# dbutils.fs.mount( # source = "abfss://"+container_r...`

Latest Reply
Anonymous
Not applicable
Hi @Givi Salu, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
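For reference, a sketch of an OAuth mount with a service principal; all IDs, scopes, and names are placeholders. If `dbutils.fs.ls('/mnt/bronze')` works but a read fails with "No such file or directory", also check whether the failing API expects the local `/dbfs/mnt/...` prefix rather than the Spark path:

```python
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="my-scope", key="sp-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://bronze@<account>.dfs.core.windows.net/",
    mount_point="/mnt/bronze",
    extra_configs=configs,
)
```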
alesventus
by New Contributor III
  • 1965 Views
  • 2 replies
  • 3 kudos

Pyspark Merge parquet and delta file

Is it possible to use the merge command when the source file is Parquet and the destination is Delta, or must both be Delta files? Currently, I'm using this code where I transform the Parquet file into Delta, and it works. But I want to avoid this transformation. T...

Latest Reply
Anonymous
Not applicable
Hi @Ales ventus, we haven't heard from you since the last response from @Kaniz Fatma, and I was checking back to see if her suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to others...
1 More Replies
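To the question above: MERGE only requires the target to be a Delta table; the source can be any DataFrame, including one read straight from Parquet. A sketch, with placeholder paths and join key:

```python
from delta.tables import DeltaTable

# Source: plain Parquet, read as a DataFrame -- no conversion to Delta needed.
source_df = spark.read.parquet("/mnt/landing/updates")       # placeholder path

# Target: must be a Delta table.
target = DeltaTable.forPath(spark, "/mnt/silver/customers")  # placeholder path

(target.alias("t")
 .merge(source_df.alias("s"), "t.id = s.id")                 # placeholder key
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```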
Anonymous
by Not applicable
  • 444 Views
  • 0 replies
  • 0 kudos

Hi ML practitioners: how are you productionizing your ML workloads? Are you using MLflow? What's your take on MLflow Recipes?...

Hi ML practitioners, I want to ask you all: how are you productionizing your ML workloads? Are you using MLflow? What's your take on MLflow Recipes? Let's get the conversation started. MLflow Recipes (previously known as MLflow Pipelines) is a framewo...

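For anyone curious what MLflow Recipes looks like in practice, a minimal sketch, assuming a recipe repository with a `recipe.yaml` and a `local` profile already scaffolded:

```python
from mlflow.recipes import Recipe

recipe = Recipe(profile="local")  # resolves recipe.yaml + profiles/local.yaml
recipe.run()                      # run the whole recipe: ingest -> train -> evaluate
recipe.run(step="train")          # or re-run a single step
model = recipe.get_artifact("model")  # inspect a step's output
```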
PGrover
by New Contributor II
  • 1143 Views
  • 2 replies
  • 3 kudos

Connecting to Synapse database using AzureCliCredential token in Spark

I want to connect to my Azure Synapse database using Spark. I can do this in pyodbc no problem, but that is not what I want. Here is how I get my credentials: `credential = AzureCliCredential()` then `databaseToken = credential.get_token('https://database.window...`

Latest Reply
Anonymous
Not applicable
Hi @Patrick Grover, we haven't heard from you since the last response from @Kaniz Fatma, and I was checking back to see if her suggestions helped you. If you have found a solution, please share it with the community, as it can be helpful to ot...
1 More Replies
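A sketch of the token-based route in Spark itself, assuming the cluster's Microsoft SQL Server JDBC driver honors the `accessToken` connection property; server and table names are placeholders:

```python
from azure.identity import AzureCliCredential

credential = AzureCliCredential()
token = credential.get_token("https://database.windows.net/.default").token

df = (spark.read
      .format("jdbc")
      .option("url",
              "jdbc:sqlserver://<workspace>.sql.azuresynapse.net:1433;database=<db>")
      .option("dbtable", "dbo.my_table")  # placeholder
      .option("accessToken", token)       # passed through to the JDBC driver
      .load())
```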
Soma
by Valued Contributor
  • 1237 Views
  • 3 replies
  • 4 kudos

View rocksdb statestore

I used the RocksDB state store for streaming and use collect_set to achieve de-dup logic. Is there any way I can convert the RocksDB key-value iterator to a normal string? I need to validate how it is stored internally, as I might need to store 50k distinct val...

Latest Reply
Anonymous
Not applicable
Hi @somanath Sankaran, hope all is well! Just wanted to check in to see if you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you.
2 More Replies
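On newer runtimes there is a state store reader that may answer this directly. An unverified sketch, assuming a runtime that ships Structured Streaming's `statestore` data source; the checkpoint path is a placeholder:

```python
# Unverified sketch: requires a runtime with the state reader available.
state = (spark.read
         .format("statestore")
         .load("/checkpoints/dedup_stream"))  # placeholder checkpoint path

state.printSchema()          # key/value structs vary by stateful operator
state.show(truncate=False)   # inspect how the de-dup state is stored
```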
girl_chan
by New Contributor II
  • 690 Views
  • 1 reply
  • 0 kudos

Azure Databricks API and Databricks DLT

How can I pass a parameter from an Azure Data Factory REST web API call to Delta Live Tables in Databricks? I get this error: "Py4JJavaError: An error occurred while calling o382.getArgument.: com.databricks.dbutils_v1.InputWidgetNotDefined: No input widget named *** def...

Latest Reply
Anonymous
Not applicable
Hi @milena chambe, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
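A sketch of the usual workaround: DLT pipelines don't support `dbutils.widgets`, so pass the value into the pipeline's configuration map (which the Pipelines REST API that ADF calls can set) and read it with `spark.conf`; the key and table names are placeholders:

```python
import dlt

# Set "mypipeline.run_date" in the DLT pipeline's configuration; it then
# surfaces through spark.conf instead of a notebook widget.
run_date = spark.conf.get("mypipeline.run_date", "1970-01-01")

@dlt.table
def filtered_events():
    return spark.table("bronze.events").where(f"event_date = '{run_date}'")
```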