Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

lizou
by Contributor III
  • 6322 Views
  • 3 replies
  • 0 kudos

bug: add csv data UI: missing leading zero

Using the Add Data UI and adding a CSV manually, the leading zeros are dropped even when the data type is set to string.

Example CSV:
val1,val2
0012345, abc

After loading the data, 123,abc is stored in the table.

Latest Reply
lizou
Contributor III
  • 0 kudos

There are no issues using spark.read in notebooks; the issue is specific to the Add Data user interface when adding a CSV file manually.

2 More Replies
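As the reply notes, spark.read is unaffected; the behavior is specific to the Add Data UI. A minimal pure-Python sketch (outside Spark, with hypothetical values) of the underlying point — leading zeros survive only while the value stays a string, and are lost the moment it is coerced to a number:

```python
import csv
import io

# Hypothetical CSV matching the report: a value with leading zeros.
raw = "val1,val2\n0012345,abc\n"

row = next(csv.DictReader(io.StringIO(raw)))

as_string = row["val1"]             # kept as text: leading zeros preserved
as_number = str(int(row["val1"]))   # coerced to int and back: zeros lost

print(as_string)  # 0012345
print(as_number)  # 12345
```

This is why forcing a string type at read time (as spark.read with an explicit string schema does) keeps the zeros, while any path that infers a numeric type drops them.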
leandro
by Databricks Partner
  • 2580 Views
  • 0 replies
  • 0 kudos

Connection from on-premise R sparklyr session to Databricks, " invalid method toDF for object 17/org.apache.spark.sql.DataFrameReader fields 0 selected 0"

Hello, I'm working with an on-premises R session and would like to connect to Databricks using sparklyr. RStudio Server in this case is not an option. I would like to use JDBC. I tested RJDBC + DBI and can connect locally and perform operations. However,...

DSam05
by New Contributor
  • 3505 Views
  • 2 replies
  • 1 kudos

10.4 LTS has an outdated Snowflake Spark connector; how to force the latest Snowflake Spark connector?

Hi, I am trying to run my code from a Scala fat JAR on Azure Databricks, which connects to Snowflake for the data. I usually run my JAR on 9.1 LTS. However, when I run on 10.4 LTS the performance was 4x degraded, and the log says WARN SnowflakeConnect...

Latest Reply
slavat
New Contributor II
  • 1 kudos

I also encountered a similar problem. This is a snippet from my log file: 22/12/18 09:36:28 WARN SnowflakeConnectorUtils$: Query pushdown is not supported because you are using Spark 3.2.0 with a connector designed to support Spark 3.1. Either use t...

1 More Replies
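One hedged workaround (a sketch, not verified against 10.4 LTS): install a Spark 3.2-compatible Snowflake connector as a cluster library via Maven coordinates so it takes precedence over the bundled one, e.g. with a payload like this to the Databricks Libraries API (`POST /api/2.0/libraries/install`). The cluster ID is a placeholder and the version shown is illustrative; check Maven Central for the current `spark_3.2` build of `spark-snowflake_2.12`.

```json
{
  "cluster_id": "<your-cluster-id>",
  "libraries": [
    {
      "maven": {
        "coordinates": "net.snowflake:spark-snowflake_2.12:2.11.0-spark_3.2"
      }
    }
  ]
}
```

The same coordinates can also be attached through the cluster UI under Libraries > Install new > Maven.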
SeliLi_52097
by New Contributor III
  • 5635 Views
  • 5 replies
  • 7 kudos

Resolved! 14-day free trial console showing blank page

I would like to register for a new 14-day free trial account as my existing one expires. I received the welcome email to validate my email address. I followed the link to set my password and it redirected me to the Databricks console page, but the pa...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 7 kudos

You can log in to portal.azure.com and create a new Databricks workspace; there is an option for a 14-day premium free trial. I use that approach every time.

4 More Replies
rams
by Contributor
  • 3723 Views
  • 3 replies
  • 4 kudos

Resolved! 14 day trial version console showing blank screen after login

I have taken a trial version of Databricks and wanted to configure it with AWS, but after login it has shown a blank screen for 20 hours. Can someone help me with this? Note: I strictly have to use AWS with Databricks for configuration.

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 4 kudos

Try reaching out to your account manager.

2 More Replies
Riddhi
by Databricks Partner
  • 2389 Views
  • 2 replies
  • 2 kudos

Databricks Lakehouse Fundamentals Accreditation V2 Badge/Certificate Date Not Updated

I had appeared for Databricks Lakehouse Fundamentals Accreditation for both V1 and V2. Recently I came to know that when you take V1 test, and you get a badge and certificate - once you take V2 test, it updates that same badge and certificate to the ...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

Yes, raise a support case; they will update you soon.

1 More Replies
yzaehringer
by New Contributor
  • 3005 Views
  • 1 replies
  • 0 kudos

GET_COLUMNS fails with "Unexpected character ('t' (code 116)): was expecting comma to separate Object entries" - how to fix?

I just ran `cursor.columns()` via the Python client and got back an `org.apache.hive.service.cli.HiveSQLException` as the response. There is also a long stack trace; I'll just paste the last bit because it might be illuminating: org.apache.spark.sql....

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

This can be a package issue or a runtime issue; try changing both.

BigMF
by New Contributor III
  • 7279 Views
  • 2 replies
  • 1 kudos

Resolved! Can I use Widgets in a Delta Live Table pipeline

Hello, I'm pretty new to Databricks in general and Delta Live Tables specifically. My problem statement is that I'd like to loop through a set of files and run a notebook that loads the data into some Delta Live Tables. Additionally, I'd like to include...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 1 kudos

It may be possible: in the job run settings there is a configuration option where the user can add parameters.

1 More Replies
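As an alternative to widgets (which Delta Live Tables pipelines do not support the way interactive notebooks do), pipeline-level key/value configuration can be set in the pipeline settings and read inside the notebook with `spark.conf.get(...)`. A sketch of the settings fragment; the keys and values below are hypothetical:

```json
{
  "name": "my_dlt_pipeline",
  "configuration": {
    "source_path": "/mnt/raw/files",
    "table_prefix": "bronze_"
  }
}
```

Inside the pipeline notebook, `spark.conf.get("source_path")` would then return `/mnt/raw/files`, which a loop over files could use in place of a widget value.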
KKo
by Contributor III
  • 3781 Views
  • 2 replies
  • 2 kudos

delete and append in delta path

I am deleting data from the curated path based on a date column and appending staged data onto it on each run, using the script below. My fear is that just after the delete operation, if a network issue appears and the job stops before it appends the staged da...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

thanks man

1 More Replies
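The worry in the question (a crash between the delete and the append leaving the table half-updated) is exactly what an atomic replace avoids. A minimal file-based sketch of the idea in plain Python — Delta Lake gives the equivalent per-commit guarantee, e.g. a single overwrite with the `replaceWhere` write option instead of a separate delete followed by an append:

```python
import json
import os
import tempfile

def atomic_replace(path: str, rows: list) -> None:
    """Write the new state to a temp file, then swap it in with one
    atomic os.replace -- readers never see a half-updated file."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(rows, f)
    os.replace(tmp, path)  # atomic on POSIX and Windows

# Demo: replace a date partition with staged rows in one step,
# so there is no window where the old rows are gone and the new
# rows are not yet written.
with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "curated.json")
    atomic_replace(target, [{"date": "2023-01-01", "v": 1}])
    atomic_replace(target, [{"date": "2023-01-01", "v": 2}])
    with open(target) as f:
        print(json.load(f))  # [{'date': '2023-01-01', 'v': 2}]
```

The file swap is only an analogy; the point is that one atomic commit replaces two non-atomic steps, and a failure mid-way leaves the previous state fully intact.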
timothy_hartant
by Databricks Partner
  • 2918 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks Certified Machine Learning Associate Badge not Received Yet

I recently passed my Databricks Certified Machine Learning Associate exam on Tuesday (04/01) and still have not received my badge on the Accredible website. Please advise.

Latest Reply
Chaitanya_Raju
Honored Contributor
  • 1 kudos

@Timothy Hartanto First of all, congratulations on your achievement. You will receive your certificate and badge at the registered mail address within 24-48 hours post-completion of your examination. Hope this helps! All the very best for your f...

2 More Replies
Manojkumar
by New Contributor II
  • 6120 Views
  • 4 replies
  • 0 kudos

Can we assign a default value to selected columns in Spark SQL when the column is not present?

I'm reading an Avro file and loading it into a table. The Avro data is nested. Now from this table I'm trying to extract the necessary elements using Spark SQL, using the explode function when there is array data. Now the challenge is there are cases like the ...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 0 kudos

Hi @manoj kumar The easiest way would be to make use of unmanaged Delta tables and, while loading data into the path of that table, enable mergeSchema. This handles all the schema differences; in case a column is not present, as null an...

3 More Replies
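With the mergeSchema approach above, a missing column comes through as null, and a default can then be filled in (in Spark SQL, something like `coalesce(col, 'default')` after reading with an explicit schema). A minimal pure-Python sketch of the same fill-missing-with-default idea, using hypothetical records:

```python
# Hypothetical nested records where the optional field "city" may be absent.
records = [
    {"id": 1, "city": "Oslo"},
    {"id": 2},  # "city" missing -> treat as null, then fill a default
]

DEFAULT_CITY = "unknown"

filled = [
    {"id": r["id"], "city": r.get("city") or DEFAULT_CITY}
    for r in records
]

print(filled)  # [{'id': 1, 'city': 'Oslo'}, {'id': 2, 'city': 'unknown'}]
```

The field names and default here are illustrative; the pattern is the same whether the gap comes from an absent dict key or a null column after a schema merge.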
julie
by New Contributor III
  • 6699 Views
  • 5 replies
  • 3 kudos

Resolved! Scope creation in Databricks or Confluent?

Hello, I am a newbie in this field and am trying to access a Confluent Kafka stream in Databricks on Azure, based on a beginner's video by Databricks. I have a free trial Databricks cluster right now. When I run the notebook below, it errors out on line 5 o...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

For testing, create it without a secret scope. It will be unsafe, but you can paste secrets as strings in the notebook for testing. Here is the code I used for loading data from Confluent: inputDF = (spark.readStream.format("kafka").option("kafka.b...

4 More Replies
galop12
by New Contributor
  • 6294 Views
  • 3 replies
  • 0 kudos

Databricks workspace (with managed VNET) upgrade to premium failing

I am trying to upgrade our Databricks workspace from standard to premium but am running into issues. The workspace is currently deployed in a managed VNET. I tried the migration tool as well as just re-creating a premium workspace with the same parameter...

Latest Reply
lskw
New Contributor II
  • 0 kudos

Hi, I have the same situation when trying to upgrade from Standard to Premium on Azure. My error: "ConflictWithNetworkIntentPolicy","message":"Found conflicts with NetworkIntentPolicy. Details: Subnet or Virtual Network cannot have resources or properties...

2 More Replies
CaseyTercek_
by New Contributor II
  • 1820 Views
  • 2 replies
  • 1 kudos

Lineage - It would be nice if lineage in Unity allowed API calls to add additional lineage information

It would be nice if the lineage in Unity would allow for API calls that could add additional lineage information, somehow. I am not certain exactly what would be ideal, but some sort of feature to include source systems in it.

Latest Reply
Hubert-Dudek
Databricks MVP
  • 1 kudos

Purview is quite commonly integrated to solve this issue. I think lineage in Unity Catalog is designed to be auto-generated. I know there are information tables, but I have never manually manipulated them.

1 More Replies