Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

bjjkkk
by New Contributor II
  • 3902 Views
  • 2 replies
  • 1 kudos

Getting 'No GCP Marketplace token provided' error while signing up from GCP marketplace.

Hey guys, I was trying to sign up for the 14-day free trial from the GCP Marketplace. When I click 'SIGN UP WITH DATABRICKS', I get the error below: HTTP ERROR 401 Problem accessing /sign-up. Reason: No GCP Marketplace token provided. Please start over fr...

[attachment: error screenshot]
Latest Reply
bjjkkk
New Contributor II
  • 1 kudos

Thanks Walter, I have the IAM permissions in place and also have a valid billing account. However, I keep getting the same error regarding the missing Marketplace token. I am clicking the 'SIGN UP WITH DATABRICKS' button from the GCP UI, so am not sure...

1 More Replies
Rizaldy
by New Contributor II
  • 1815 Views
  • 2 replies
  • 0 kudos

HELP: opening a notebook displays blank, creating a new one gives an error, and other issues

Hi, Situation: I just literally started using Databricks. I created a workspace, a cluster, and uploaded a notebook, but my workspace doesn't seem to function correctly at the moment. I will attach what it looks like when I try to open a notebook. Opening ...

[4 screenshots attached]
Latest Reply
Rizaldy
New Contributor II
  • 0 kudos

UPDATE: I have downloaded Chrome and this does not happen there.

1 More Replies
aman_yadav007
by New Contributor
  • 1279 Views
  • 1 reply
  • 0 kudos

Databricks Widget

Hi, I was previously working on Databricks Runtime 10.0 and just upgraded to Runtime 13.0. I was using a dashboard to display the widgets. Before, it showed just the widget label, but now it shows the widget name below it as well. Also it shows the ...

Latest Reply
SparkJun
Databricks Employee
  • 0 kudos

Hi @aman_yadav007, which widget type did you use? Can you please try a different widget type or check the widget type and its arguments from this example: https://docs.databricks.com/en/notebooks/widgets.html#databricks-widgets
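For quick reference, here is a minimal sketch of the widget API from the linked docs; the widget names, defaults, and choices below are made up for illustration and assume the code runs in a Databricks notebook where dbutils is available:

```python
# Minimal sketch of Databricks notebook widgets; names and values are
# illustrative, and dbutils is only available inside a Databricks notebook.

# Text widget: the third argument is the label a dashboard displays.
dbutils.widgets.text("env", "dev", "Environment")

# Dropdown widget with a fixed set of choices.
dbutils.widgets.dropdown("region", "us-east-1", ["us-east-1", "eu-west-1"], "Region")

# Read the current value of a widget by name.
env = dbutils.widgets.get("env")
print(f"Selected environment: {env}")
```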

NP7
by New Contributor II
  • 3189 Views
  • 2 replies
  • 0 kudos

DLT pipeline Unity Catalog error

Hi Everyone, I'm getting this error while running a DLT pipeline in UC: Failed to execute python command for notebook 'sample/delta_live_table_rules.py' with id RunnableCommandId(5596174851701821390) and error AnsiResult(,None, Map(), Map(),List(),List(...

Latest Reply
RaulValMob
New Contributor II
  • 0 kudos

I get a similar error when there is a mistake in the @dlt.table() definition for a table. In my case, the culprit is usually the path.
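To illustrate the kind of mistake described above, here is a hypothetical sketch of a @dlt.table definition; in a Unity Catalog pipeline the storage location is managed for you, so passing a wrong (or any) explicit path is a common trigger for this class of error:

```python
import dlt

# Hypothetical sketch of a UC-managed DLT table: no path= argument, since
# Unity Catalog manages the storage location for the table.
@dlt.table(
    name="sample_bronze",                 # illustrative table name
    comment="Raw sample data for the pipeline",
)
def sample_bronze():
    # The source path below is a placeholder.
    return spark.read.format("json").load("/Volumes/main/default/raw/sample")
```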

1 More Replies
Bharathi-Rajen
by New Contributor II
  • 2221 Views
  • 2 replies
  • 0 kudos

Unable to migrate an empty parquet table to delta lake in Databricks

I'm trying to convert my Databricks tables from Parquet to Delta. While most of the tables have data and are successfully converted to Delta, some of the empty Parquet tables fail with an error message as below: CONVERT TO DELTA <schema-name>.parquet_...

Latest Reply
BR_DatabricksAI
Contributor III
  • 0 kudos

Hello Bharathi, ideally the ETL job should not generate empty Parquet files in the respective location, since reading empty files is overhead and not a best practice. Assuming this can be easily fixed in the ETL job by getting the row count...
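Building on that suggestion, one possible workaround is sketched below with hypothetical table names: convert non-empty tables in place and recreate the empty ones as Delta directly, rather than running CONVERT TO DELTA on them. This is an untested sketch, not a verified fix:

```python
# Hedged sketch: convert non-empty Parquet tables in place, recreate empty
# ones as Delta. Table names are hypothetical.
tables = ["my_schema.parquet_orders", "my_schema.parquet_customers"]

for t in tables:
    df = spark.table(t)
    if df.limit(1).count() == 0:
        # Empty table: capture the schema, then recreate it as a Delta table.
        schema = df.schema
        spark.sql(f"DROP TABLE {t}")
        spark.createDataFrame([], schema).write.format("delta").saveAsTable(t)
    else:
        # Non-empty table: in-place conversion.
        spark.sql(f"CONVERT TO DELTA {t}")
```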

1 More Replies
marketing2
by New Contributor
  • 738 Views
  • 0 replies
  • 0 kudos

The importance of Databricks in SEO

SEO is a dynamic and complex field that constantly evolves with search technologies and algorithms. The use of Databricks, a cloud-based analytics platform, has revolutionized the way SEO specialists...

Mado
by Valued Contributor II
  • 7445 Views
  • 2 replies
  • 0 kudos

Read CSV files in Azure Databricks notebook, how to read data when columns in CSV files are in the w...

I have a task to revise CSV ingestion in Azure Databricks. The current implementation uses the below settings: source_query = ( spark.readStream.format("cloudFiles") .option("cloudFiles.format", "csv") .schema(defined_schema) .option(...

Latest Reply
Mado
Valued Contributor II
  • 0 kudos

 Also, I am looking for a solution that works with both correct files and malformed files using PySpark. 
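One way to get a stream that tolerates malformed rows with Auto Loader is sketched below; the schema and paths are placeholders, and the shape of the stream is assumed from the snippet in the question. The badRecordsPath option diverts unparseable records to files instead of failing the stream:

```python
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Placeholder schema and paths for illustration.
defined_schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("name", StringType(), True),
])

source_query = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .schema(defined_schema)
    # Malformed rows are written here as JSON instead of failing the stream.
    .option("badRecordsPath", "/tmp/bad_records")
    .load("/tmp/csv_input")
)
```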

1 More Replies
nag_kanchan
by New Contributor III
  • 9964 Views
  • 4 replies
  • 0 kudos

Resolved! DatabaseError: (databricks.sql.exc.ServerOperationError) [UNBOUND_SQL_PARAMETER]

Hi, I am trying to connect my database through an LLM and expect to receive a description of the table and the first 3 rows from it. from langchain.agents import create_sql_agent from langchain.agents.agent_toolkits import SQLDatabaseToolkit from la...

Latest Reply
nag_kanchan
New Contributor III
  • 0 kudos

This is not a Databricks issue but a LangChain one. A PR has been raised to solve it: https://github.com/langchain-ai/langchain/issues/11068. One workaround that worked is setting sample_rows_in_table_info to 0 when calling SQLDatabase.from_databricks...
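For anyone hitting the same error, here is a sketch of that workaround; the connection values are placeholders, and only sample_rows_in_table_info=0 is the actual fix from the linked issue:

```python
from langchain.sql_database import SQLDatabase

# Placeholder Databricks connection details.
db = SQLDatabase.from_databricks(
    catalog="my_catalog",
    schema="my_schema",
    host="adb-1234567890123456.7.azuredatabricks.net",
    api_token="dapiXXXXXXXX",
    warehouse_id="abcdef1234567890",
    # Workaround: skip row sampling, which triggers UNBOUND_SQL_PARAMETER.
    sample_rows_in_table_info=0,
)
```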

3 More Replies
Sujitha
by Databricks Employee
  • 11395 Views
  • 2 replies
  • 1 kudos

Creating High Quality RAG Applications with Databricks

Retrieval-Augmented-Generation (RAG) has quickly emerged as a powerful way to incorporate proprietary, real-time data into Large Language Model (LLM) applications. Today we are excited to launch a suite of RAG tools to help Databricks users build hig...

Latest Reply
antsdispute
New Contributor II
  • 1 kudos

It seems like you're sharing an announcement or promotional content related to Databricks and their launch of a suite of tools for Retrieval-Augmented-Generation (RAG) applications. These tools are aimed at helping Databricks users build high-quality...

1 More Replies
SethParker
by New Contributor III
  • 6088 Views
  • 2 replies
  • 1 kudos

Power BI Import Model Refresh from Databricks SQL Whse - Query has been timed out due to inactivity

We have an intermittent issue where occasionally a partition in our Power BI Import dataset times out at 5 hours. When I look at Query History in Databricks SQL, I see a query that failed with the following error message: "Query has been timed out ...

Latest Reply
SethParker
New Contributor III
  • 1 kudos

The only solution we have been able to come up with was to create a notebook in Databricks that uses the Power BI API to check the status of a refresh. We schedule it a bit after we expect the refresh to complete. If it is still running, we kill th...
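A rough sketch of that watchdog idea follows, assuming an Azure AD access token for the Power BI REST API is already available; the workspace, dataset, and token values are placeholders, and cancelling via DELETE applies to enhanced (XMLA-based) refreshes:

```python
import requests

# Placeholder identifiers and token; acquiring the AAD token is out of scope here.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
TOKEN = "<aad-access-token>"
BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Fetch the most recent refresh entry and inspect its status.
resp = requests.get(f"{BASE}/refreshes?$top=1", headers=HEADERS)
resp.raise_for_status()
latest = resp.json()["value"][0]

if latest["status"] == "Unknown":  # "Unknown" means the refresh is still running
    # Cancel the in-flight refresh (supported for enhanced refreshes).
    requests.delete(f"{BASE}/refreshes/{latest['requestId']}", headers=HEADERS)
```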

1 More Replies
Dlt
by New Contributor III
  • 13268 Views
  • 11 replies
  • 1 kudos

DLT Pipeline issue - Failed to read dataset. Dataset is not defined in the pipeline.

Background: I have created a DLT pipeline in which I am creating a temporary table. There are 5 temporary tables as such. When I executed these in an independent notebook, they all worked fine with DLT. Now I have merged this notebook (keeping same ...
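For context, here is a hypothetical sketch of the pattern in question; a dataset can only be read with dlt.read if it is defined in a notebook attached to the same pipeline, which is the usual cause of the "Dataset is not defined in the pipeline" error:

```python
import dlt

# Hypothetical sketch: a temporary table and a downstream table in one pipeline.
@dlt.table(temporary=True)      # temporary tables are dropped after the update
def staging_orders():
    return spark.read.table("samples.tpch.orders")

@dlt.table()
def orders_clean():
    # dlt.read resolves names only for datasets defined in this same pipeline;
    # referencing a dataset from a notebook not attached to the pipeline raises
    # "Dataset is not defined in the pipeline".
    return dlt.read("staging_orders").dropna()
```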

Latest Reply
Wojciech_BUK
Valued Contributor III
  • 1 kudos

I am sorry, but the information you are providing is not helping at all. Please dump your code here.

10 More Replies
dhrubg
by New Contributor
  • 7954 Views
  • 0 replies
  • 0 kudos

Databricks for practice at no cost: which cloud service or combination do I need to use?

Hi all seniors, Context: I want to use Databricks for practice, to create projects and keep polishing my knowledge. My free credits are already used up. Now can you please give me tips on how to run Databricks, and in which cloud provider (storage account com...

AbhilashMV
by New Contributor II
  • 1756 Views
  • 0 replies
  • 0 kudos

Not able to download Certificate

Hi All, I took the course Get Started With Data Engineering from the course link below: https://www.databricks.com/learn/training/getting-started-with-data-engineering#data-video But after completing the quiz, I am not able to download the certificate. The a...

Sujitha
by Databricks Employee
  • 10035 Views
  • 2 replies
  • 0 kudos

Unlock Data Engineering Essentials in Just 90 Minutes - Get Certified for FREE!

There’s an increasing demand for data, analytics and AI talent in every industry. Start building your data engineering expertise with this self-paced course — and earn an industry-recognized Databricks certificate. This course provides four short tu...

Latest Reply
AbhilashMV
New Contributor II
  • 0 kudos

Same here. I am not able to download any certificate even after passing the quiz. But the course link https://www.databricks.com/learn/training/getting-started-with-data-engineering#data-video clearly says: take a short knowledge test and earn a com...

1 More Replies
tomcorbin
by New Contributor III
  • 2769 Views
  • 1 reply
  • 0 kudos

Resolved! Is it possible to pass a Spark session to other python files?

I am setting up pytest for my repo. I have my functions in separate Python files and run pytest from one notebook. For each testing file, I have to create a new Spark session as follows: @pytest.fixture(scope="session") def spark(): spark = ( SparkSe...

Latest Reply
tomcorbin
New Contributor III
  • 0 kudos

I was able to do it by placing the Spark session fixture in the conftest.py file in the root directory. 
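For others landing here, a minimal sketch of that layout follows; the file contents are assumed, not taken from the thread:

```python
# conftest.py (repo root). pytest auto-discovers fixtures defined here, so
# every test file can request `spark` without importing anything.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    spark = SparkSession.builder.appName("tests").getOrCreate()
    yield spark       # all tests in the session share this one Spark session
    spark.stop()
```

Any test can then declare spark as a parameter, e.g. def test_count(spark): ..., and pytest injects the shared session.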

