Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

lizou
by Contributor II
  • 2323 Views
  • 3 replies
  • 0 kudos

Bug: Add Data UI (CSV): missing leading zeros

Using the Add Data UI and adding a CSV manually, even with the data type set to string, the leading zeros go missing. Example CSV:

val1,val2
0012345,abc

After loading, 12345,abc is stored in the table.

Latest Reply
lizou
Contributor II
  • 0 kudos

There are no issues using spark.read in notebooks; the issue is specific to the Add Data user interface when adding a CSV file manually.

2 More Replies
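The reply's point can be sketched outside Databricks: as long as the column stays a string end to end, the zeros survive. A minimal standard-library example (sample values taken from the post):

```python
import csv
import io

# Sample CSV mirroring the post: a value with leading zeros.
raw = "val1,val2\n0012345,abc\n"

# csv returns every field as a string, so the leading zeros are kept.
rows = list(csv.DictReader(io.StringIO(raw)))
print(rows[0]["val1"])  # 0012345
```

The bug report is precisely that the UI drops those zeros even when the declared type is string, whereas any string-typed read path keeps them.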
leandro
by New Contributor
  • 620 Views
  • 0 replies
  • 0 kudos

Connection from on-premise R sparklyr session to Databricks, "invalid method toDF for object 17/org.apache.spark.sql.DataFrameReader fields 0 selected 0"

Hello, I'm working with an on-premise R session and would like to connect to Databricks using sparklyr. RStudio Server is not an option in this case. I would like to use JDBC. I tested RJDBC + DBI and can connect locally and perform operations. However,...

DSam05
by New Contributor
  • 1518 Views
  • 3 replies
  • 1 kudos

10.4 LTS has an outdated Snowflake Spark connector; how to force the latest Snowflake Spark connector?

Hi, I am trying to run my code from a Scala fat JAR on Azure Databricks, which connects to Snowflake for the data. I usually run my JAR on 9.1 LTS. However, when I run on 10.4 LTS the performance was 4x degraded, and the log says WARN SnowflakeConnect...

Latest Reply
slavat
New Contributor II
  • 1 kudos

I also encountered a similar problem. This is a snippet from my log file:

22/12/18 09:36:28 WARN SnowflakeConnectorUtils$: Query pushdown is not supported because you are using Spark 3.2.0 with a connector designed to support Spark 3.1. Either use t...

2 More Replies
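A common workaround (not stated in the thread, and the exact coordinates below are an assumption to verify against Snowflake's documentation) is to install a connector build matching the runtime's Spark version as a cluster library via its Maven coordinates, e.g.:

```
net.snowflake:spark-snowflake_2.12:2.10.0-spark_3.2
```

The -spark_3.x suffix in the connector's version string is what has to line up with the cluster's Spark version; the log above shows Spark 3.2.0 paired with a connector built for Spark 3.1, which is exactly the mismatch that disables query pushdown.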
SeliLi_52097
by New Contributor III
  • 1910 Views
  • 5 replies
  • 7 kudos

Resolved! 14-day free trial console showing blank page

I would like to register for a new 14-day free trial account as my existing one expires. I received the welcome email to validate my email address. I followed the link to set my password and it redirected me to the Databricks console page, but the pa...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 7 kudos

You can just log in to portal.azure.com and create a new Databricks workspace; there is an option for a 14-day premium free trial. I use that approach every time.

4 More Replies
Riddhi
by New Contributor III
  • 949 Views
  • 2 replies
  • 2 kudos

Databricks Lakehouse Fundamentals Accreditation V2 Badge/Certificate Date Not Updated

I took the Databricks Lakehouse Fundamentals accreditation for both V1 and V2. Recently I came to know that when you take the V1 test you get a badge and certificate, and once you take the V2 test it updates that same badge and certificate to the ...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 2 kudos

Yes, raise a case; they will update you soon.

1 More Replies
yzaehringer
by New Contributor
  • 1164 Views
  • 1 replies
  • 0 kudos

GET_COLUMNS fails with "Unexpected character ('t' (code 116)): was expecting comma to separate Object entries" — how to fix?

I just ran `cursor.columns()` via the Python client and got back an `org.apache.hive.service.cli.HiveSQLException` as the response. There is also a long stack trace; I'll just paste the last bit because it might be illuminating: org.apache.spark.sql....

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

This can be a package issue or a runtime issue; try changing both.

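The message in the title is a Jackson JSON parse failure: somewhere, a malformed JSON object (a missing comma between entries) is being deserialized. The same class of failure can be reproduced with the standard library (the sample document below is hypothetical, not the one from the stack trace):

```python
import json

# A malformed object: missing the comma between two entries, the same
# class of error Jackson reports as "was expecting comma to separate
# Object entries".
bad = '{"type": "struct" "fields": []}'

try:
    json.loads(bad)
    msg = "parsed unexpectedly"
except json.JSONDecodeError as err:
    msg = err.msg

print(msg)  # Expecting ',' delimiter
```

Seen this way, the reply's advice makes sense: the malformed payload comes from a mismatch between what one component writes and what another parses, so changing the client package and/or runtime version changes which (de)serializer is in play.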
BigMF
by New Contributor III
  • 2402 Views
  • 2 replies
  • 1 kudos

Resolved! Can I use Widgets in a Delta Live Table pipeline

Hello, I'm pretty new to Databricks in general and Delta Live Tables specifically. My problem statement is that I'd like to loop through a set of files and run a notebook that loads the data into some Delta Live Tables. Additionally, I'd like to include...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 1 kudos

It may be possible: in the pipeline/job settings there is a configuration option where the user can add parameters.

1 More Replies
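For reference, the usual route for parameterizing a Delta Live Tables pipeline (rather than widgets) is the `configuration` map in the pipeline settings JSON, whose entries can be read in the notebook with `spark.conf.get`. A sketch with hypothetical names and paths:

```json
{
  "name": "file_loader_pipeline",
  "configuration": {
    "source_path": "/mnt/landing/incoming"
  }
}
```

In the pipeline notebook, `spark.conf.get("source_path")` would then return the configured value, assuming a settings fragment like the one above.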
timothy_hartant
by New Contributor
  • 1008 Views
  • 3 replies
  • 1 kudos

Resolved! Databricks Certified Machine Learning Associate Badge not Received Yet

I recently passed my Databricks Certified Machine Learning Associate exam on Tuesday (04/01) and still have not received my badge on the Accredible website. Please advise.

Latest Reply
Chaitanya_Raju
Honored Contributor
  • 1 kudos

@Timothy Hartanto​ First of all, congratulations on your achievement. You will receive your certificate and badge at the registered email address within 24-48 hours of completing your examination. Hope this helps! All the very best for your f...

2 More Replies
Manojkumar
by New Contributor II
  • 2424 Views
  • 4 replies
  • 0 kudos

Can we assign a default value to select columns in Spark SQL when the column is not present?

I'm reading an Avro file and loading it into a table. The Avro data is nested. From this table I'm trying to extract the necessary elements using Spark SQL, using the explode function where there is array data. Now the challenge is there are cases like the ...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 0 kudos

Hi @manoj kumar​ The easiest way would be to make use of unmanaged Delta tables and, while loading data into the path of that table, enable mergeSchema. This handles all the schema differences; in case a column is not present, as null an...

3 More Replies
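The mergeSchema route fills columns that are absent from a given file with null at load time. The fallback idea itself can be sketched outside Spark in plain Python (the records below are hypothetical):

```python
# Nested records mirroring Avro rows where a field may be absent.
records = [
    {"id": 1, "detail": {"status": "ok"}},
    {"id": 2},                       # "detail" missing entirely
]

# Flatten with a default when the nested column is not present.
flat = [
    {"id": r["id"],
     "status": r.get("detail", {}).get("status", None)}  # default if absent
    for r in records
]
print(flat)
```

Whether the default is supplied by mergeSchema at load time or by a coalesce/`.get()` at extraction time, the key point is the same: the absent column resolves to a known value (null here) instead of failing the query.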
julie
by New Contributor III
  • 2342 Views
  • 5 replies
  • 3 kudos

Resolved! Scope creation in Databricks or Confluent?

Hello, I am a newbie in this field and am trying to access a Confluent Kafka stream in Databricks on Azure, based on a beginner's video by Databricks. I have a free trial Databricks cluster right now. When I run the notebook below, it errors out on line 5 o...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

For testing, create it without a secret scope. It will be unsafe, but you can put secrets as plain strings in the notebook while testing. Here is the code I used for loading data from Confluent:

inputDF = (spark
  .readStream
  .format("kafka")
  .option("kafka.b...

4 More Replies
gbradley145
by New Contributor III
  • 2534 Views
  • 3 replies
  • 4 kudos

Why does Databricks SQL drop trailing zeros in the DECIMAL data type?

All, I have a column, RateAdj, that is defined as DECIMAL(15,5), and I can see that the value is 4.00000, but when this gets inserted into my table it shows as just 4.

%sql
SELECT LTRIM(RTRIM(IFNULL(FORMAT_NUMBER(RateADJ, '0.00000'), '0.00000')))

This i...

2 More Replies
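Worth noting: the stored DECIMAL(15,5) value keeps its scale; what varies is display formatting, which is why the FORMAT_NUMBER call in the post brings the zeros back. Python's decimal module shows the same distinction:

```python
from decimal import Decimal

# A DECIMAL(15,5)-style value: Decimal keeps the stored scale, so the
# trailing zeros are part of the value itself.
rate_adj = Decimal("4.00000")
as_stored = str(rate_adj)
print(as_stored)   # 4.00000

# If a display layer drops the zeros, pin the format explicitly.
formatted = f"{rate_adj:.5f}"
print(formatted)   # 4.00000
```

So the fix belongs at the presentation layer (an explicit format), not in the table definition, which is already preserving the full scale.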
galop12
by New Contributor
  • 2246 Views
  • 3 replies
  • 0 kudos

Databricks workspace (with managed VNET) upgrade to premium failing

I am trying to upgrade our Databricks workspace from Standard to Premium but am running into issues. The workspace is currently deployed in a managed VNET. I tried the migration tool as well as just re-creating a Premium workspace with the same parameter...

Latest Reply
lskw
New Contributor II
  • 0 kudos

Hi, I have the same situation when trying to upgrade from Standard to Premium on Azure. My error: "ConflictWithNetworkIntentPolicy", "message": "Found conflicts with NetworkIntentPolicy. Details: Subnet or Virtual Network cannot have resources or properties...

2 More Replies
CaseyTercek_
by New Contributor II
  • 627 Views
  • 2 replies
  • 1 kudos

Lineage - API calls to add additional lineage information in Unity Catalog

It would be nice if lineage in Unity Catalog allowed API calls that could add additional lineage information. I am not certain exactly what form it should take, but some sort of feature to include source systems in it.

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

Purview is often integrated to solve this issue. I think lineage in the Unity Catalog is designed to be auto-generated. I know there are information tables, but I have never manually manipulated them.

1 More Replies
vanessafvg
by New Contributor III
  • 2363 Views
  • 3 replies
  • 1 kudos

Linking filters from different Databricks SQL queries in a dashboard

I am having to use a Databricks SQL dashboard for some analysis, and it seems very clunky. If I have multiple queries, is it possible to apply the same filters to all the queries in the dashboard, or do I have to duplicate the filters for each query in the ...

Latest Reply
FelixH
New Contributor II
  • 1 kudos

Same issue here. According to the docs, using query filters with the same name and values should result in a single dashboard filter. However, the filters are duplicated. I also tried using this setting, but without success.

2 More Replies
sher
by Valued Contributor II
  • 1001 Views
  • 3 replies
  • 1 kudos

Resolved! Do we have any certificate vouchers for the Databricks sessions in the upcoming days?

Hi Team, do we have any program for certificate vouchers for the Databricks sessions in the upcoming days?

Latest Reply
sher
Valued Contributor II
  • 1 kudos

@Vidula Khanna​ I found the registration link for the certificate voucher: https://docs.google.com/presentation/d/1sy5hSSnFtncrpYY1EYi0WMsDkJK0dYk9iKBAeeAha8E/edit#slide=id.g1ade45a9cd6_0_543

2 More Replies