Data Engineering
Forum Posts

vichus1995
by New Contributor
  • 1118 Views
  • 2 replies
  • 0 kudos

Mounted Azure Storage shows mount.err inside folder while reading from Azure Databricks

I'm using an Azure Databricks notebook to read an Excel file from a folder inside a mounted Azure Blob Storage container. The mounted Excel location is like: "/mnt/2023-project/dashboard/ext/Marks.xlsx". 2023-project is the mount point and dashboard is the name o...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @vichus1995 Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!

1 More Replies
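A common cause of errors like the one above is passing a Spark-style /mnt path to a single-node library such as pandas, which only sees the mount through the local /dbfs root. A minimal sketch of that path translation (the pandas usage is an assumption for illustration, not taken from the thread):

```python
def to_local_path(mount_path: str) -> str:
    """Convert a DBFS mount path ("/mnt/...") to the "/dbfs/mnt/..."
    local-file path that single-node libraries like pandas can open."""
    if not mount_path.startswith("/mnt/"):
        raise ValueError("expected a path under /mnt/")
    return "/dbfs" + mount_path

# On a cluster, pandas (not Spark) would then read the workbook directly:
# import pandas as pd
# df = pd.read_excel(to_local_path("/mnt/2023-project/dashboard/ext/Marks.xlsx"))
```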
Jits
by New Contributor II
  • 525 Views
  • 2 replies
  • 3 kudos

Getting Error when Inserting data into table with the column as bigint

Hi All, I am creating a table using the Databricks SQL editor. The table definition is: DROP TABLE IF EXISTS [database].***_test; CREATE TABLE [database].***_jitu_test (id bigint) USING delta LOCATION 'test/raw/***_jitu_test' TBLPROPERTIES ('delta.minReaderVersi...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @jitendra goswami We haven't heard from you since the last response from @Werner Stinckens, and I was checking back to see if their suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpf...

1 More Replies
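For reference, bigint in Databricks SQL is a signed 64-bit integer, and inserts can fail when a value is out of that range or arrives as an incompatible type. A generic pre-insert range check (an illustrative sketch, not the fix from this thread):

```python
# Signed 64-bit bounds used by the SQL BIGINT type.
BIGINT_MIN, BIGINT_MAX = -(2 ** 63), 2 ** 63 - 1

def fits_bigint(value: int) -> bool:
    """True if the value can be stored in a BIGINT column without overflow."""
    return BIGINT_MIN <= value <= BIGINT_MAX
```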
jhgorse
by New Contributor III
  • 1079 Views
  • 2 replies
  • 1 kudos

Resolved! autoloader break on migration from community to trial premium with s3 mount

In dbx Community Edition, the Auto Loader works using the S3 mount. S3 mount, Auto Loader: dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name} from pyspark.sql import SparkSession from pyspark.sql.functions ...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Joe Gorse Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...

1 More Replies
signo
by New Contributor II
  • 1131 Views
  • 3 replies
  • 2 kudos

Delta lake schema enforcement allows datatype mismatch on write using MERGE-operation [python]

Databricks Runtime: 12.2 LTS, Spark: 3.3.2, Delta Lake: 2.2.0. A target table with schema ([c1: integer, c2: integer]) allows us to write into the target table using data with schema ([c1: integer, c2: double]). I expected it to throw an exception (same a...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Sigrun Nordli Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...

2 More Replies
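Because a MERGE may implicitly cast rather than reject a mismatched column type, one workaround is an explicit pre-write comparison of the two schemas. A pure-Python sketch (the dict-of-type-names representation is an assumption for illustration, not the thread's accepted answer):

```python
def mismatched_columns(target: dict, source: dict) -> list:
    """Return columns present in both schemas whose declared types differ,
    e.g. an integer target column being fed double values."""
    return [col for col in target if col in source and target[col] != source[col]]

# Mirroring the schemas from the post:
# mismatched_columns({"c1": "integer", "c2": "integer"},
#                    {"c1": "integer", "c2": "double"}) -> ["c2"]
```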
kll
by New Contributor III
  • 2710 Views
  • 2 replies
  • 3 kudos

Nested struct type not supported pyspark error

I am attempting to apply a function to a PySpark DataFrame, save the API response to a new column, and then parse it using `json_normalize`. This works fine in pandas; however, I run into an exception with `pyspark`. import pyspark.pandas as ps i...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Keval Shah Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...

1 More Replies
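The flattening that pandas.json_normalize performs on a nested API response can be reproduced with plain Python before handing the data to Spark, which sidesteps nested-struct inference entirely. A minimal sketch of that idea (not the fix from this thread):

```python
def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten nested dicts into dotted column names, similar to what
    pandas.json_normalize produces for a JSON API response."""
    out = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, name + "."))
        else:
            out[name] = value
    return out

# flatten({"status": "ok", "geo": {"lat": 1.0, "lon": 2.0}})
# -> {"status": "ok", "geo.lat": 1.0, "geo.lon": 2.0}
```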
Divya_Bhadauria
by New Contributor II
  • 1343 Views
  • 2 replies
  • 2 kudos

Running databricks job with different parameter automatically

I have a Python script running as a Databricks job. Is there a way I can run this job with a different set of parameters automatically or programmatically, without using the "run with different parameters" option available in the UI?

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Divya Bhadauria We haven't heard from you since the last response from @Lakshay Goel, and I was checking back to see if their suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpful to ...

1 More Replies
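One programmatic route is the Jobs API 2.1 run-now endpoint, whose notebook_params override the job's configured defaults for that run. A sketch of just the request body (the workspace URL and token mentioned in the comments are placeholders, not values from the thread):

```python
import json

def build_run_now_payload(job_id: int, params: dict) -> str:
    """JSON body for POST /api/2.1/jobs/run-now: trigger the given job
    with notebook parameters that override its configured defaults."""
    return json.dumps({"job_id": job_id, "notebook_params": params})

# POST this body to https://<workspace-url>/api/2.1/jobs/run-now
# with an "Authorization: Bearer <token>" header.
```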
Khalil
by Contributor
  • 1214 Views
  • 0 replies
  • 0 kudos

Snowpark vs Spark on Databricks

Why / when should we choose Spark on Databricks over Snowpark if the data we are processing resides in Snowflake?

Gilg
by Contributor II
  • 611 Views
  • 0 replies
  • 0 kudos

Databricks Runtime 12.1 spins VM in Ubuntu 18.04 LTS

Hi Team, our cluster is currently on DBR 12.1, but it spins up VMs with Ubuntu 18.04 LTS. 18.04 will be EOL soon. According to https://learn.microsoft.com/en-us/azure/databricks/release-notes/runtime/12.1 the OS version should be 20.04, and now a bit...

David_K93
by Contributor
  • 1508 Views
  • 1 reply
  • 2 kudos

Resolved! Building a Document Store on Databricks

Hello, I am somewhat new to Databricks and am trying to build a Q&A application based on a collection of documents. I need to move .pdf and .docx files from my local machine to storage in Databricks and eventually a document store. My questions are: Wh...

Latest Reply
David_K93
Contributor
  • 2 kudos

Hi all, I took an initial stab at task one with some success using the Databricks CLI. Here are the steps: Open a Command/Anaconda prompt and enter: pip install databricks-cli. Go to your Databricks console and under settings find "User Settings" and...

guostong
by New Contributor III
  • 897 Views
  • 3 replies
  • 1 kudos

Issues to load from ADLS in DLT

I am using DLT to load CSV files from ADLS; below is my SQL query in a notebook: CREATE OR REFRESH STREAMING LIVE TABLE test_account_raw AS SELECT * FROM cloud_files( "abfss://my_container@my_storageaccount.dfs.core.windows.net/test_csv/", "csv", map("h...

Latest Reply
guostong
New Contributor III
  • 1 kudos

Thank you everyone, the problem is resolved; it went away once I had workspace admin access.

2 More Replies
Ashwathy
by New Contributor II
  • 3506 Views
  • 3 replies
  • 3 kudos

Facing issue while using widget values in sql script

I am using the code below to create and read widgets, assigning a default value: dbutils.widgets.text("pname", "default", "parameter_name"); pname = dbutils.widgets.get("pname"). I am using this widget parameter in some SQL scripts; one example is given below...

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @Ashwathy P P, which Databricks Runtime are you using? A known issue is that widget state may not be fully cleared after pressing Run All, even after clearing or removing the widget in the code. If this happens, you will see a discrepancy be...

2 More Replies
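In notebook SQL, widget references such as ${pname} are resolved by textual substitution before the query runs, which is why stale widget state shows up as an unexpected literal in the query. A simplified local stand-in for that substitution (an illustrative sketch, not Databricks' actual resolver):

```python
def render_sql(template: str, widgets: dict) -> str:
    """Substitute ${name} widget references in a SQL template with the
    current widget values, mimicking notebook-side resolution."""
    for name, value in widgets.items():
        template = template.replace("${" + name + "}", value)
    return template

# render_sql("SELECT * FROM marks WHERE pname = '${pname}'", {"pname": "default"})
# -> "SELECT * FROM marks WHERE pname = 'default'"
```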
Learning
by New Contributor
  • 749 Views
  • 2 replies
  • 0 kudos

Databricks extension in Visual Studio Code

Databricks recently introduced an extension for VS Code. This is a good feature, but my company has some security concerns. If I wanted to block connecting to Databricks from Visual Studio Code, how can I do it? Is there any process which blocks con...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Kiran Gogula We haven't heard from you since the last response from @Debayan Mukherjee, and I was checking back to see if their suggestions helped you. Or else, if you have any solution, please share it with the community, as it can be helpful t...

1 More Replies
BWong
by New Contributor III
  • 2449 Views
  • 2 replies
  • 1 kudos

Overwriting schema in Delta Live Tables

Hi all, I have a table created by DLT. Initially I specified cloudFiles.inferColumnTypes as false and all columns are stored as strings. However, I now want to use cloudFiles.inferColumnTypes=true. I dropped the table and re-ran the pipeline, which fai...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Billy Wong Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...

1 More Replies