by
RantoB
• Valued Contributor
- 9170 Views
- 6 replies
- 9 kudos
Hi, I have the following error:

Error: b'{"error_code":"TEMPORARILY_UNAVAILABLE","message":"The service at /api/2.0/workspace/get-status is temporarily unavailable. Please try again later."}'

when I do:

databricks workspace export_dir path .

or databrick...
Latest Reply
Please try to reconfigure the CLI. Please double-check the Databricks host: databricks configure --token. Regarding the second command you shared (%sh ls /Workspace): it will not work on the free Community Edition. There you can only use native functions like dbu...
5 More Replies
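A hedged sketch of the reconfiguration suggested in the reply above, assuming the legacy Databricks CLI; the host and paths shown are placeholders, not values from the thread:

```shell
# Reconfigure the legacy Databricks CLI; it will prompt interactively
# for the workspace host and a personal access token.
databricks configure --token
#   Databricks Host: https://<your-workspace>.cloud.databricks.com
#   Token: <personal-access-token>

# Then retry the export. TEMPORARILY_UNAVAILABLE is often transient,
# so retrying after a short wait can also help.
databricks workspace export_dir /Users/<you>/project ./project
```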
by
RS1
• New Contributor III
- 1774 Views
- 11 replies
- 9 kudos
@Kaniz Fatma​ I attended the Advanced Machine Learning with Databricks training virtually last week. I am still unable to get the day 2 session videos of any of the instructor-led paid trainings. They are supposed to be available for replay within 24...
Latest Reply
Hi @Kaniz Fatma​, they uploaded the full video for the Advanced Machine Learning with Databricks course day 2. Thank you for the follow-up. But we still have the same issue with Apache Spark Programming with Databricks - Bundle: Day 2 Training. Can you...
10 More Replies
- 1178 Views
- 2 replies
- 1 kudos
Hello friends, I have a DataFrame with specific values. I am trying to find specific values out of it.

Input:

| ID | text |
|:---|:-----|
| 1 | select distinct Col1 as OrderID from Table1 WHERE ( (Col3 Like '%ABC%') OR (Col3 Like '%DEF%') OR (Col3 Like '... |
Latest Reply
What is the logic for the substring function? Can't you use str1[idxi+14:3] for the substring?
1 More Replies
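A note on the slice suggested in the reply above: in Python, s[start:stop] takes absolute indices, so a fixed-length substring is s[i:i+n] rather than s[i:3]. A minimal sketch of that pattern; the query string and offsets below are hypothetical, not from the thread:

```python
# Hypothetical example: pull the column name that follows "distinct "
# in a SQL string, using find() plus a fixed-length slice.
sql = "select distinct Col1 as OrderID from Table1"

idx = sql.find("distinct ")        # index where "distinct " starts
start = idx + len("distinct ")     # first character after the keyword
col = sql[start:start + 4]         # slice of length 4

print(col)  # -> Col1
```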
- 1648 Views
- 4 replies
- 0 kudos
Hey there Community! I'm using dlt.apply_changes in my DLT job as follows:

dlt.apply_changes( target = "employee_silver", source = "employee_bronze_clean_v", keys = ["EMPLOYEE_ID"], sequence_by = col("last_updated"), apply_as_deletes = expr("Op ...
Latest Reply
First try expr("Operation = 'DELETE'") for your apply_as_deletes
3 More Replies
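A hedged sketch of how the reply's suggestion could slot into the call from the question. The table, view, and column names come from the question; this assumes a Delta Live Tables pipeline environment (dlt is only importable there) and a CDC "Operation" column whose delete marker is the string 'DELETE':

```python
import dlt
from pyspark.sql.functions import col, expr

dlt.create_streaming_table("employee_silver")

dlt.apply_changes(
    target = "employee_silver",
    source = "employee_bronze_clean_v",
    keys = ["EMPLOYEE_ID"],
    sequence_by = col("last_updated"),
    # Match the full column value rather than a truncated expression
    apply_as_deletes = expr("Operation = 'DELETE'"),
)
```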
by
leon
• New Contributor II
- 1924 Views
- 2 replies
- 1 kudos
Hello, I am querying my Delta Lake with SQL Connect and later want to explore the result in pandas.

with connection.cursor() as cursor:
    cur = cursor.execute("""
        SELECT DISTINCT sample_timestamp, value, name
        FROM de...
Latest Reply
Hi @Leon Bam​, Please check this article and let us know if that helps.
1 More Replies
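One common way to land a DB-API cursor's result in pandas is to pair fetchall() with cursor.description for column names. A sketch using sqlite3 as a stand-in for the Databricks SQL connector (both follow DB-API 2.0, so the shape is the same); the table and data below are hypothetical:

```python
import sqlite3
import pandas as pd

# Stand-in connection; with databricks-sql-connector you would instead use
# databricks.sql.connect(server_hostname=..., http_path=..., access_token=...)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sample_timestamp TEXT, value REAL, name TEXT)")
conn.execute("INSERT INTO readings VALUES ('2022-06-01T00:00:00', 1.5, 'sensor_a')")

cursor = conn.cursor()
cursor.execute("SELECT DISTINCT sample_timestamp, value, name FROM readings")

columns = [desc[0] for desc in cursor.description]   # column names from the cursor
df = pd.DataFrame(cursor.fetchall(), columns=columns)

print(df)
```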
- 4650 Views
- 26 replies
- 4 kudos
Use Case Sharing Sweepstakes! Data + AI Summit is in full swing, and we know you are just as excited as we are to learn about the new and exciting things happening at Databricks. From notebooks to the Lakehouse, we know some of these new features wil...
Latest Reply
Cloning libraries when cloning clusters: currently, when we clone clusters, the externally added libraries aren't copied as part of the cloning process. It's expected behavior, but a missing feature. At times new developers end up spending a lot of time in debug...
25 More Replies
- 4101 Views
- 1 reply
- 0 kudos
When attempting to deploy/start a Databricks cluster on AWS through the UI, the following error consistently occurs:

Bootstrap Timeout: [id: InstanceId(i-093caac78cdbfa7e1), status: INSTANCE_INITIALIZING, workerEnvId: WorkerEnvId(workerenv-335698072713...
Latest Reply
Hi @Junaid Ahmed​, nice to meet you, and thank you for asking this question. We have had a similar issue in the past, and it received a best answer. Please see this community thread with the same question, and let us know if that helps you.
- 5063 Views
- 2 replies
- 12 kudos
In Databricks jobs, there's a field for maximum concurrent runs which can be set to 1000. If I have a cluster with 4 worker nodes and 8 cores each, then at most how many concurrent jobs will I be able to execute? What will happen if I launch 100 instances of the sam...
Latest Reply
@Aman Sehgal​ On an E2 workspace the limit is 1000 concurrent runs. If you trigger 100 runs at the same time, 100 clusters will be created and the runs will be executed. If you use the same cluster for 100 runs, then you might face a lot of failed jobs...
1 More Replies
- 3158 Views
- 1 reply
- 1 kudos
In the release notes of May 2022 it says that we are now able to investigate our SQL results in Python in a Python notebook. (See also the documentation here: Use notebooks - Azure Databricks | Microsoft Docs.) So I created a simple query (select * from ...
Latest Reply
This feature was delayed and will be rolled out over Databricks platform releases 3.74 through 3.76. You can check the release notes for more info: https://docs.databricks.com/release-notes/product/2022/may.html
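For reference, the feature in question exposes a SQL cell's result to later Python cells as an implicit PySpark DataFrame named _sqldf. A hedged notebook sketch, assuming the feature has rolled out to your workspace (this only works inside a Databricks notebook):

```python
# In a %sql cell:
#   SELECT * FROM my_table LIMIT 10
#
# In a following Python cell, the SQL result is available as _sqldf:
pdf = _sqldf.toPandas()   # convert the implicit DataFrame to pandas
display(_sqldf)           # or inspect it directly
```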
- 5944 Views
- 7 replies
- 2 kudos
Hi All, I am loading some data using Auto Loader but am having trouble with schema evolution. A new column has been added to the data I am loading, and I am getting the following error:

StreamingQueryException: Encountered unknown field(s) during parsing:...
Latest Reply
I agree that hints are the way to go if you have the schema available, but the whole point of schema evolution is that you might not always know the schema in advance. I received a similar error with a similar streaming query configuration. The issue w...
6 More Replies
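A hedged sketch of an Auto Loader stream combining schema evolution with optional hints, per the discussion above. This assumes a Databricks runtime with a spark session; paths, formats, and hint columns are placeholders:

```python
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    # Persisted schema tracking; required for schema inference/evolution
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/_schema")
    # Default mode: new columns are added to the schema, and the stream
    # fails once so it can be restarted to pick them up
    .option("cloudFiles.schemaEvolutionMode", "addNewColumns")
    # Optional: pin types only for the columns you do know in advance
    .option("cloudFiles.schemaHints", "id BIGINT, amount DOUBLE")
    .load("/mnt/raw/events")
)
```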
- 1109 Views
- 2 replies
- 3 kudos
Is there a way to create a generic user account and personal access token to connect to Databricks? I have an Azure build pipeline and a VSCode test that use my personal access token for running builds and tests.
Latest Reply
You can create a service account (service principal) for jobs, applications, etc. Here's a link to the docs: https://docs.databricks.com/administration-guide/users-groups/service-principals.html
1 More Replies
- 1151 Views
- 5 replies
- 2 kudos
I am trying to set up audit log delivery in Google Cloud. I have followed this page https://docs.gcp.databricks.com/administration-guide/account-settings-gcp/log-delivery.html and have added log-delivery@databricks-prod-master.iam.gserviceaccount.co...
Latest Reply
Hi @Md Tahseen Anam​, we haven't heard from you since the last response from @Prabakar, and I was checking back to see if his suggestions helped you. Otherwise, if you have any solution, please share it with the community, as it can be helpful to others. A...
4 More Replies
- 773 Views
- 2 replies
- 2 kudos
For instance, I'm ingesting webhook data into a Delta table with Auto Loader and need to run a process for each new record as it arrives.
Latest Reply
With Auto Loader, you can build something like a changelog and record data about the operations performed on each micro-batch - like the affected ID, I/U/D, timestamp, etc. Then you can make use of this changelog table and run subsequent processes for each row aff...
1 More Replies
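One hedged way to sketch the per-micro-batch processing described in the reply above is Structured Streaming's foreachBatch hook. This assumes a Databricks runtime; handle_webhook_record, the paths, and the id column are hypothetical names, not from the thread:

```python
def process_new_records(batch_df, batch_id):
    # Runs once per micro-batch; batch_df holds only the newly
    # ingested rows, so per-record work can be driven from here.
    for row in batch_df.select("id").toLocalIterator():
        handle_webhook_record(row["id"])   # hypothetical downstream step

(
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/checkpoints/webhooks_schema")
    .load("/mnt/raw/webhooks")
    .writeStream
    .foreachBatch(process_new_records)
    .option("checkpointLocation", "/mnt/checkpoints/webhooks")
    .start()
)
```

Note that foreachBatch runs the function on the driver, so heavy per-record work is better pushed into DataFrame operations inside it.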
- 865 Views
- 4 replies
- 4 kudos
I want to know which cloud is better to learn and which cloud's services have more career opportunities.
Latest Reply
Hi @ishant jain​, we haven't heard from you since the last response from me and @Cedric Law Hing Ping​, and I was checking back to see if our solutions helped you. Otherwise, if you have any solution, please share it with the community as it can be hel...
3 More Replies