Data Engineering

Forum Posts

sunil_smile
by Contributor
  • 2297 Views
  • 2 replies
  • 1 kudos

VNet peering setting is not enabled in Azure Databricks Premium, even though it's deployed inside my VNet?

Hi All, the VNet peering setting is not enabled in Azure Databricks, even though it is deployed inside my VNet. Here I have not mentioned my VNet and subnet details, but I filled these in and created Databricks (without private endpoint - allow public access) virtual...

Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, VNet peering is not supported on VNet-injected workspaces. Please refer to: https://learn.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/vnet-peering#requirements

1 More Replies
patdev
by New Contributor III
  • 917 Views
  • 2 replies
  • 2 kudos

Load new data in Delta table

Hello all, I want to know how to load new data into a Delta table from a new CSV file. Here is the code that I used to create the Delta table from a CSV file and load the data, but I have got a new updated file and am trying to load the new data, but am not able to any gui...

Latest Reply
patdev
New Contributor III
  • 2 kudos

Thank you, I tried that and it ended in an error. The table created with Delta is from a CSV, which must have been converted to a Parquet file, and all the columns are varchar or string. So now if I want to enter a new file it ends in an incompatibility error for da...

1 More Replies
sunil_smile
by Contributor
  • 3559 Views
  • 9 replies
  • 11 kudos

Resolved! How can I add ADLS Gen2 OAuth 2.0 as cluster scope for my High Concurrency shared cluster (without Unity Catalog)?

Hi All, kindly help me with how I can add ADLS Gen2 OAuth 2.0 authentication to my High Concurrency shared cluster. I want to scope this authentication to the entire cluster, not to a particular notebook. Currently I have added them as Spark configuration o...

Latest Reply
Kaniz
Community Manager
  • 11 kudos

Hi @Sunilprasath Elangovan, we haven't heard from you since the last response from @Hubert Dudek, and I was checking back to see if his suggestions helped you. Or else, if you have any solution, please do share it with the community as it can b...

8 More Replies
JesseS
by New Contributor
  • 2420 Views
  • 2 replies
  • 1 kudos

Resolved! How to extract source data from on-premise databases into a data lake and load with AutoLoader?

Here is the situation I am working with. I am trying to extract source data with the Databricks JDBC connector, using SQL Server databases as my data source. I want to write it into a directory in my data lake as JSON files, then have AutoLoader ing...

Latest Reply
Aashita
Contributor III
  • 1 kudos

To add to @werners' point, I would use ADF to load SQL Server data into ADLS Gen2 as JSON, then load these raw JSON files from your ADLS base location into a Delta table using Autoloader. Delta Live Tables can be used in this scenario. You can also reg...

1 More Replies
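The Auto Loader leg of the pipeline the reply suggests (ADF lands raw JSON in ADLS Gen2, Auto Loader ingests it into Delta) can be sketched roughly as below. This is an illustrative sketch for Databricks; the function name, paths, schema location, and table name are hypothetical placeholders.

```python
# Sketch: incrementally pick up JSON files that ADF has landed in
# ADLS Gen2 and append them to a Delta table via Auto Loader.
def start_autoloader(spark, source_path, schema_path, checkpoint_path, target_table):
    stream = (
        spark.readStream.format("cloudFiles")              # Auto Loader source
        .option("cloudFiles.format", "json")               # raw files are JSON
        .option("cloudFiles.schemaLocation", schema_path)  # where inferred schema is tracked
        .load(source_path)
    )
    return (
        stream.writeStream
        .option("checkpointLocation", checkpoint_path)
        .trigger(availableNow=True)                        # batch-style incremental run
        .toTable(target_table)
    )
```

On Databricks this would be called with the workspace's `spark` session; it will not run outside a Databricks runtime, since `cloudFiles` is a Databricks-only source.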
databicky
by Contributor II
  • 475 Views
  • 1 reply
  • 0 kudos

Resolved! How to create a border for some specific cells?

I tried some code to create a border in an Excel sheet. For a particular cell I am able to write it, but when I try with a set of cells it shows an error.

Latest Reply
Chaitanya_Raju
Honored Contributor
  • 0 kudos

Hi @Mohammed sadamusean, can you please try something similar to the below code using loops? I have implemented a similar use case that might be useful; please let me know if you need further help. top = Side(border_style = 'thin', color = '00000000') bottom = Side(bor...

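The truncated reply appears to use openpyxl's `Side` and `Border` objects. A small self-contained sketch of applying a thin border to a rectangular range of cells (the sheet, range, and output filename are arbitrary choices for illustration):

```python
from openpyxl import Workbook
from openpyxl.styles import Border, Side

wb = Workbook()
ws = wb.active

thin = Side(border_style="thin", color="00000000")
box = Border(top=thin, bottom=thin, left=thin, right=thin)

# ws["B2:D5"] yields the cells row by row, so a double loop
# covers every cell in the rectangle.
for row in ws["B2:D5"]:
    for cell in row:
        cell.border = box

wb.save("bordered.xlsx")
```

Styles in openpyxl apply per cell, which is why a range needs the loop rather than a single assignment.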
sreedata
by New Contributor III
  • 2060 Views
  • 5 replies
  • 9 kudos

Resolved! Getting status of "If Condition" Activity into a variable

"If Condition" has a lot of activities that can succeed or fail. If any activity fails then the whole "If Condition" fails. I have to get the status of the "If Condition" activity (pass or fail) so that I can use it for processing in the next notebook t...

Latest Reply
Kaniz
Community Manager
  • 9 kudos

Hi @srikanth nair, we haven't heard from you since the last response from @Hubert Dudek and @Uma Maheswara Rao Desula, and I was checking back to see if their suggestions helped you. Or else, if you have any solution, share it with the communit...

4 More Replies
Nhan_Nguyen
by Valued Contributor
  • 4668 Views
  • 15 replies
  • 27 kudos

Resolved! Did not receive Databricks Certification: Fully Sponsored after order on Reward Store

Hi team, would you please help check on my case? On 30-Nov I placed an order for "Databricks Certification: Fully Sponsored" on https://communitydatabricks.mybrightsites.com/ and, after waiting 10 business days, I still have not received the voucher. Is t...

Latest Reply
ramravi
Contributor II
  • 27 kudos

I received a voucher today and it says "Please find your code here (expires 6/1/23):" Does that mean it expires on 1-June-2023?

14 More Replies
Chanu
by New Contributor II
  • 988 Views
  • 2 replies
  • 2 kudos

Databricks JAR task type functionality

Hi, I would like to understand Databricks JAR-based workflow tasks. Can I interpret JAR-based runs as something like a spark-submit on a cluster? In the logs, I was expecting to see the spark-submit --class com.xyz --num-executors 4 etc. And the...

Latest Reply
Chanu
New Contributor II
  • 2 kudos

Hi, I did try using Workflows > Jobs > Create Task > JAR task type, uploaded my JAR and class, created a job cluster, and tested the task. This JAR reads some tables as input, does some transformations, and writes some other tables as output. I would like t...

1 More Replies
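On the question above: a JAR task is effectively a managed spark-submit, but the Jobs service assembles the command itself, so a literal `spark-submit --class ... --num-executors ...` line does not appear in the logs; the class comes from `main_class_name` and executor count from the cluster size. A hedged sketch of a Jobs API 2.1 create payload, written as a Python dict (the job name, JAR path, and node type are hypothetical):

```python
# Hypothetical payload for POST /api/2.1/jobs/create.
# main_class_name plays the role of spark-submit's --class;
# num_workers replaces --num-executors.
jar_job = {
    "name": "nightly-transform",
    "tasks": [
        {
            "task_key": "run_jar",
            "spark_jar_task": {"main_class_name": "com.xyz.Main"},
            "libraries": [{"jar": "dbfs:/jars/my-app.jar"}],
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 4,
            },
        }
    ],
}
```

The same structure is what the Workflows UI builds behind the scenes when a JAR task type is configured by hand.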
sudhanshu1
by New Contributor III
  • 1620 Views
  • 4 replies
  • 2 kudos

Resolved! DLT workflow failing to read files from AWS S3

Hi All, I am trying to read streams directly from AWS S3. I set the instance profile, but when I run the workflow it fails with the below error: "No AWS Credentials provided by TemporaryAWSCredentialsProvider : shaded.databricks.org.apache.hadoop.fs.s3a.C...

Latest Reply
Vivian_Wilfred
Honored Contributor
  • 2 kudos

Hi @SUDHANSHU RAJ​ is UC enabled on this workspace? What is the access mode set on the cluster? Is this coming from the metastore or directly when you read from S3? Is the S3 cross-account?

3 More Replies
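The error usually means the s3a client was told to use TemporaryAWSCredentialsProvider but never received session keys. A hedged debugging sketch: the keys below are standard hadoop-aws (s3a) options, set as cluster Spark configuration with the `spark.hadoop.` prefix; the values are placeholders, and an instance profile remains the recommended path over literal keys.

```python
# Option A: make s3a use the instance profile attached to the cluster.
# Option B (commented out): supply temporary keys explicitly, which is
# what TemporaryAWSCredentialsProvider expects.
s3a_conf = {
    "spark.hadoop.fs.s3a.aws.credentials.provider":
        "com.amazonaws.auth.InstanceProfileCredentialsProvider",
    # "spark.hadoop.fs.s3a.access.key": "<access-key>",
    # "spark.hadoop.fs.s3a.secret.key": "<secret-key>",
    # "spark.hadoop.fs.s3a.session.token": "<session-token>",
}
```

For DLT these belong in the pipeline's cluster configuration rather than in notebook code.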
alxsbn
by New Contributor III
  • 1185 Views
  • 2 replies
  • 2 kudos

Resolved! Autoloader on CSV file didn't infer a cell with JSON data well

Hello! I am playing with Autoloader schema inference on a big S3 repo with 300+ tables and large CSV files. I'm looking at Autoloader with great attention, as it can be a great time saver in our ingestion process (data comes from a transactional DB gen...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 2 kudos

PySpark by default uses \ as the escape character. You can change it to ". Doc: https://docs.databricks.com/ingestion/auto-loader/options.html#csv-options

1 More Replies
Victhor
by New Contributor III
  • 4197 Views
  • 4 replies
  • 8 kudos
Latest Reply
chanshing
New Contributor II
  • 8 kudos

@Kaniz Fatma Is that tool (dbvim) still maintained? It looks like it has been abandoned and there are a couple of unresolved issues. Are there any plans to support vim keybindings in Databricks? This is possible in many other web-based editors such a...

3 More Replies
DeveloperAmarde
by New Contributor
  • 716 Views
  • 1 reply
  • 0 kudos

Connection to Collibra

Hi Team, I want to connect to Collibra to fetch details from it. Currently we are using a username and password to connect. I want to know the recommended practice for connecting to a Collibra account from a Databricks notebook.

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, could you please check if this helps: https://marketplace.collibra.com/listings/jdbc-driver-for-databricks/

lmcglone
by New Contributor II
  • 2440 Views
  • 2 replies
  • 3 kudos

Comparing 2 dataframes and create columns from values within a dataframe

Hi, I have a dataframe that has name and company:
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName('SparkByExamples.com').getOrCreate()
columns = ["company", "name"]
data = [("company1", "Jon"), ("company2", "Steve"), ("company1", "...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

You need to join and pivot:
df.join(df2, on=[df.company == df2.job_company]) \
    .groupBy("company", "name") \
    .pivot("job_company") \
    .count()

1 More Replies
andrew0117
by Contributor
  • 4959 Views
  • 1 reply
  • 0 kudos

Resolved! How to read a local file using Databricks( file stored in your own computer)

without uploading the file into dbfs? Thanks!

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

In my opinion it doesn't make sense, but... you can mount an SMB Azure file share on a Windows machine (https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows) and then mount the same folder on Databricks using pip install ...

rams
by Contributor
  • 1029 Views
  • 4 replies
  • 4 kudos

Resolved! 14 day trial version console showing blank screen after login

I have taken a trial version of Databricks and wanted to configure it with AWS, but after login it has been showing a blank screen for 20 hours. Can someone help me with this? Note: I strictly have to use AWS with Databricks for configuration.

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 4 kudos

Try reaching out to your account manager.

3 More Replies