Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

nachii_rajput
by New Contributor
  • 2056 Views
  • 0 replies
  • 0 kudos

Issue with disabled "Repair DAG" and "Repair All DAGs" buttons in Airflow UI; the functionality itself works

We are encountering an issue in the Airflow UI where the 'Repair DAG' and 'Repair All DAGs' options are disabled when a specific task fails. While the repair functionality itself is working properly (i.e., the DAGs can still be repaired through execu...

charliemerrell
by New Contributor
  • 562 Views
  • 2 replies
  • 0 kudos

Will auto loader read files if it doesn't need to?

I want to run Auto Loader on some very large JSON files. I don't actually care about the data inside the files, just the file paths of the blobs. If I do something like `spark.readStream.format("cloudFiles").option("cloudFiles.fo...`

Latest Reply
lingareddy_Alva
Honored Contributor III
  • 0 kudos

Hi @charliemerrell Yes, Databricks will still open and parse the JSON files, even if you're only selecting `_metadata`. It must infer schema and perform basic parsing, unless you explicitly avoid it. So even if you do `.select("_metadata")`, it doesn't skip...

1 More Replies
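The advice in this thread can be sketched as follows: supplying an explicit schema avoids the inference pass the reply mentions, and selecting only `_metadata.file_path` narrows the output to paths. This is an untested sketch assuming a Databricks runtime with Auto Loader; the storage path, schema, and column names are hypothetical.

```python
# Hypothetical sketch (assumes a Databricks runtime with Auto Loader).
# The path and schema below are made up for illustration.
from pyspark.sql.types import StructType, StructField, StringType

# An explicit (minimal) schema avoids the schema-inference pass mentioned
# in the reply above; Spark may still open files while parsing records.
minimal_schema = StructType([StructField("_ignored", StringType(), True)])

df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .schema(minimal_schema)
      .load("abfss://container@account.dfs.core.windows.net/raw/")
      .select("_metadata.file_path"))
```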
samgupta88
by New Contributor
  • 873 Views
  • 2 replies
  • 1 kudos

Enroll, Learn, Earn Databricks !!

Hello Team, I attended the session at CTS Manyata on 22nd April. I am interested in pursuing the certifications, but while enrolling it says you are not a member of any group. Link for the available certifications and courses: https://community...

Latest Reply
Sujitha_admin
Databricks Employee
  • 1 kudos

Hi @samgupta88 you can find it on the partner academy. Everything is listed in the partner portal. 

1 More Replies
Armaan-Sait
by New Contributor II
  • 715 Views
  • 4 replies
  • 0 kudos

UCX Installation error

Error Message: databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 00127F76E005AE12.

Latest Reply
mnorland
Valued Contributor
  • 0 kudos

Click into each policy in the Compute UI of the Workspace to see if the policy ID exists.  If it does, then the account that invoked the SDK method didn't have workspace admin permissions.

3 More Replies
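The manual check suggested above can also be done with the databricks-sdk, the same SDK that raised the error. A minimal sketch, assuming ambient authentication (e.g. DATABRICKS_HOST / DATABRICKS_TOKEN) and permissions to list policies; the policy ID is the one from the error message:

```python
# Sketch: list cluster policy IDs via the Databricks SDK and check whether
# the ID from the error message exists in the workspace.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # uses the ambient auth configuration
policy_ids = {p.policy_id for p in w.cluster_policies.list()}
print("00127F76E005AE12" in policy_ids)
```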
BigAlThePal
by New Contributor III
  • 1185 Views
  • 3 replies
  • 0 kudos

Resolved! .py file running stuck on waiting

Hello, hope you are doing well. We are facing an issue when running .py files. This is fairly recent; we were not experiencing this issue last week. As shown in the screenshots below, the .py file hangs on "waiting" after we press "run all". No matt...

[screenshot: BigAlThePal_0-1743709475328.png]
Latest Reply
BigAlThePal
New Contributor III
  • 0 kudos

Hello, thanks a lot for your answer. We were getting the required permissions to use Firefox in our org, but in the meantime it started working again in Edge after it updated to version 135.0.3179.85 (Official build) (64-bit).

2 More Replies
sujan1
by New Contributor II
  • 6511 Views
  • 2 replies
  • 2 kudos

Resolved! requirements.txt with cluster libraries

Cluster libraries are supported from version 15.0 - Databricks Runtime 15.0 | Databricks on AWS. How can I specify the requirements.txt file path in the libraries of a job cluster in my workflow? Can I use a relative path? Is it relative from the root of th...

Latest Reply
mishravk7250
New Contributor II
  • 2 kudos

How to install requirements.txt using a GitHub Action:

- name: Install workspace requirements.txt on cluster
  env:
    CLUSTER_ID: ${{ secrets.DATABRICKS_CLUSTER_ID }}
  run: |
    databricks libraries install \
      --cluster-id "$CLUSTER_ID" \
      --whl "dbfs:/FileStore/enginee...

1 More Replies
phguk
by New Contributor III
  • 1210 Views
  • 5 replies
  • 0 kudos

Debugging notebook access to external REST API

I'm using a Python Notebook with a REST API to access a system outside Databricks, in this case it's to call a SAS program. Identical python code works fine if I call it from jupyter on my laptop, but fails with a timeout when I run it from my Databr...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

What happens if you run this command in a notebook: `nc -vz hostname 443`? If it fails to connect, this means the firewall or security groups associated with the VPC or VNet are not allowing the connection; you will need to check with your networking te...

4 More Replies
Dulce42
by New Contributor
  • 519 Views
  • 1 replies
  • 0 kudos

Trusted assets vs query examples

Hi community! In recent days I explored trusted assets in my Genie space and it's working very well, but I feel a little confused. In my Genie space I have many query examples; when I create a new function with the same query example to verify th...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @Dulce42! It depends on your use case. If your function covers the scenario well, you don’t need a separate query example. Having both for the same purpose can create redundancy and make things more complex. Choose the option that best fits you...

HaripriyaP
by New Contributor II
  • 824 Views
  • 2 replies
  • 0 kudos

Resolved! Need help to add personal email to databricks partner account

I have been actively using the Databricks Partner Academy for the past three years through my current organization. As I am planning to transition to a new company, I would like to ensure continued access to my training records and certifications. Cur...

Latest Reply
HaripriyaP
New Contributor II
  • 0 kudos

Sure. Thank you!

1 More Replies
Terje
by New Contributor
  • 714 Views
  • 1 replies
  • 0 kudos

Python versions - Notebooks and DBR

Hi, I have a problem with conflicting Python versions in a notebook running on the Databricks 14-day free trial. One example: `spark.conf.get("spark.databricks.clusterUsageTags.clusterName")` # Returns: "Python versions in the Spark Connect client and...

Latest Reply
Renu_
Valued Contributor II
  • 0 kudos

Hi @Terje, were you able to fix it? From what I know, during the free trial period we’re limited to the default setup, so version mismatches can’t be resolved unless we upgrade to a paid workspace.

phguk
by New Contributor III
  • 1408 Views
  • 2 replies
  • 0 kudos

Python coding in notebook with a (long) token

I have written a python program (called by a trigger) that uses a token issued by a third party app (it's circa 400 bytes long including '.' and '-'). When I copy/paste this token into a Databricks notebook - curious formatting takes place and a coup...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

Hey Paul, you can use Databricks secrets to preserve the integrity of the token. Here's the Databricks doc for reference: https://docs.databricks.com/aws/en/security/secrets

1 More Replies
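A minimal sketch of the secrets approach suggested above; the scope and key names are hypothetical, and `dbutils` is only available inside a Databricks notebook. The secret would be created beforehand with the Databricks CLI (e.g. `databricks secrets create-scope` followed by `databricks secrets put-secret`):

```python
# Hypothetical sketch: read the long token from a secret scope instead of
# pasting it into the notebook, which avoids the formatting problems above.
# Scope and key names are made up.
token = dbutils.secrets.get(scope="third-party-app", key="api-token")
```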
dplaut
by New Contributor II
  • 4289 Views
  • 3 replies
  • 0 kudos

Save output of show table extended to table?

I want to save the output of `SHOW TABLE EXTENDED IN catalogName LIKE 'mysearchtext*';` to a table. How do I do that?

Latest Reply
njoyb
New Contributor II
  • 0 kudos

Use `DESCRIBE EXTENDED customer AS JSON`; this returns the metadata as JSON data, which you can then load. Applicable to Databricks 16.2 and above: https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-syntax-aux-describe-table

2 More Replies
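The original ask (persisting the output of `SHOW TABLE EXTENDED` to a table) can be sketched by capturing the command's result as a DataFrame first. This is an untested sketch requiring a Spark/Databricks session; the schema and target table names are hypothetical:

```python
# Hypothetical sketch: any SHOW/DESCRIBE command run through spark.sql
# returns a DataFrame, which can then be saved as a table.
df = spark.sql("SHOW TABLE EXTENDED IN my_schema LIKE 'mysearchtext*'")
df.write.mode("overwrite").saveAsTable("my_schema.table_metadata_snapshot")
```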
BhavyaSreeBanga
by New Contributor
  • 4719 Views
  • 2 replies
  • 1 kudos

Missing Genie - Upload File Feature in Preview Section

Despite having admin privileges for both the workspace and Genie Workspace, we are unable to see the "Genie - Upload File" feature under the Preview section, even though the documentation indicates it should be available. We also attempted switching r...

Latest Reply
sridharplv
Valued Contributor II
  • 1 kudos

For more information around the upload-a-file option, please refer to https://docs.databricks.com/aws/en/genie/file-upload. It supports CSV and Excel datasets as of now, with the condition that files must be smaller than 200 MB and contain fewer than 100 columns du...

1 More Replies
abin-bcgov
by New Contributor III
  • 1691 Views
  • 4 replies
  • 4 kudos

Resolved! using Azure Databricks vs using Databricks directly

Hi friends, a quick question regarding how data and workspace controls work while using "Azure Databricks". I am planning to use Azure Databricks that comes as part of my employer's Azure subscriptions. I work for a public sector organization, which is ...

Latest Reply
abin-bcgov
New Contributor III
  • 4 kudos

Thanks a ton, @SP_6721 

3 More Replies
