- 6 Views
- 0 replies
- 0 kudos
I am using Notebooks to do some transformations. I install a new whl:
%pip install --force-reinstall /Workspace/<my_lib>.whl
%restart_python
Then I successfully import the installed lib: from my_lib.core import test. However, when I run my code with fo...
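After a --force-reinstall and restart, a useful first debugging step is confirming which copy of the package Python actually resolved. A minimal sketch (the helper name is made up; the stdlib module `json` stands in for the installed lib, which on Databricks would be `my_lib.core`):

```python
import importlib

def module_origin(name: str) -> str:
    """Return the file a module was loaded from, to confirm which copy Python picked up."""
    mod = importlib.import_module(name)
    return getattr(mod, "__file__", "<built-in>")

# Stand-in check with a stdlib module; on Databricks you would pass "my_lib.core" instead.
print(module_origin("json"))
```

If the printed path points at an old site-packages location rather than the freshly installed wheel, the restart did not take effect for that interpreter.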
- 5744 Views
- 3 replies
- 1 kudos
I have a class in a python file like this:
from pyspark.sql import SparkSession
from pyspark.dbutils import DBUtils

class DatabricksUtils:
    def __init__(self):
        self.spark = SparkSession.getActiveSession()
        self.dbutils = DBUtil...
Latest Reply
Hi, we are in the exact same situation. Were you able to solve the problem, or find a workaround?
2 More Replies
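One pattern sometimes suggested for classes like the one above is resolving the session handle lazily on first use rather than in __init__. A sketch, not the thread's confirmed fix, assuming the code ultimately runs inside an active Databricks session (the pyspark import is deferred so the module also loads where pyspark is absent):

```python
class DatabricksUtils:
    """Lazy variant of the class from the post: handles are resolved on first use, not at construction."""

    def __init__(self):
        self._spark = None

    @property
    def spark(self):
        if self._spark is None:
            # Deferred import: pyspark is only needed once the property is first accessed.
            from pyspark.sql import SparkSession
            self._spark = SparkSession.getActiveSession()
        return self._spark
```

Constructing the object is then cheap and side-effect free; the session lookup happens only when `.spark` is touched.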
by
Gutek
• New Contributor II
- 895 Views
- 3 replies
- 1 kudos
I'm trying to import a Lakeview Dashboard that I originally exported through the CLI (version 0.213.0). The exported file has the extension .lvdash.json and is a single-line JSON file. I can't get it to work; I tried this command: databricks workspace ...
Latest Reply
Thanks for flagging. There should be enhanced API documentation specific to Lakeview in the next week or two (PR is in review). Keep an eye out for a page called "Use the Lakeview API and Workspace API to create and manage Lakeview dashboards."
Curre...
2 More Replies
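For reference, one shape of the import command that is commonly cited for .lvdash.json files — the flag names are assumptions from that generation of the CLI and may differ by version, and the paths here are placeholders; note the full .lvdash.json extension is kept in the target path:

```sh
databricks workspace import /Workspace/Users/you@example.com/my_dashboard.lvdash.json \
  --file my_dashboard.lvdash.json --format AUTO
```

If the CLI rejects the format, the Lakeview-specific API documentation mentioned in the reply above is the authoritative source.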
- 51 Views
- 0 replies
- 0 kudos
I want to run a parametrized SQL query in a task. Query: select * from {{client}}.catalog.table, with the client value being {{task.name}}. If client is a string parameter, it is replaced with quotes, which throws an error. If table is a dropdown list parame...
- 1980 Views
- 5 replies
- 0 kudos
Hi, I am trying to pass a catalog name as a parameter into a query for a SQL task, and it is inserted with single quotes, which results in an error. Is there a way to pass the raw value, or another possible workaround? query: INSERT INTO {{ catalog }}.pas.product_snap...
Latest Reply
@EdemSeitkh can you elaborate on your workaround? Curious how you were able to implement an enum parameter in DBSQL. I'm running into this same issue now.
4 More Replies
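Both of these threads hit the same behavior: string parameters are substituted as quoted literals, which is invalid where a name is expected. One workaround often pointed to is Databricks SQL's IDENTIFIER clause, which interprets a string expression as a name reference. A sketch, with placeholder schema/table names, not a verified fix for either thread:

```sql
-- {{ catalog }} arrives as a quoted string; IDENTIFIER(...) interprets it as a name.
INSERT INTO IDENTIFIER({{ catalog }} || '.pas.product_snap')
SELECT * FROM source_table;
```

The same pattern applies to the SELECT case above, e.g. building the three-part name inside IDENTIFIER from the {{client}} parameter.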
- 37 Views
- 1 replies
- 0 kudos
Hello, I have a remote Azure SQL warehouse serverless instance that I can access using databricks-sql-connector. I can read/write/update tables no problem. But I'm also trying to read/write/update tables using local pyspark + JDBC drivers, and when I ...
Latest Reply
Hi @amelia1 how are you?
What you got was indeed the top 5 rows (note that it was the Row class). What does it show when you run display(df)?
I'm thinking it might be something related to your schema: since you did not define it, it can read the da...
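On the local pyspark + JDBC side, the connection string is a frequent culprit. A sketch of assembling the URL in the Databricks JDBC driver's documented format — the host, HTTP path, and token below are placeholders, not values from the thread:

```python
def databricks_jdbc_url(host: str, http_path: str, token: str) -> str:
    """Build a JDBC URL for the Databricks driver; all arguments here are placeholders."""
    return (
        f"jdbc:databricks://{host}:443/default;"
        "transportMode=http;ssl=1;"
        f"httpPath={http_path};"
        f"AuthMech=3;UID=token;PWD={token}"
    )

url = databricks_jdbc_url("adb-123.azuredatabricks.net", "/sql/1.0/warehouses/abc", "dapi-XXXX")
# The URL would then go to spark.read.format("jdbc").option("url", url).option("dbtable", ...)
print(url)
```

AuthMech=3 with UID=token is the personal-access-token scheme; misreads often trace back to a wrong httpPath or a missing transportMode/ssl pair.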
by
RobinK
• New Contributor III
- 1757 Views
- 12 replies
- 11 kudos
Hello, since last night none of our ETL jobs in Databricks have been running, although we have not made any code changes. The identical jobs (deployed with Databricks Asset Bundles) run on an all-purpose cluster but fail on a job cluster. We have no...
Latest Reply
I do not believe this is solved; similar to a comment over here: https://community.databricks.com/t5/data-engineering/databrickssession-broken-for-15-1/td-p/70585. We are also seeing this error in 14.3 LTS from a simple example: from pyspark.sql.function...
11 More Replies
by
TWib
• New Contributor III
- 1196 Views
- 7 replies
- 3 kudos
This code fails with exception: [NOT_COLUMN_OR_STR] Argument `col` should be a Column or str, got Column.
File <command-4420517954891674>, line 7
4 spark = DatabricksSession.builder.getOrCreate()
6 df = spark.read.table("samples.nyctaxi.trips")
---->...
Latest Reply
We are also seeing this error in 14.3 LTS from a simple example:
from pyspark.sql.functions import col
df = spark.table('things')
things = df.select(col('thing_id')).collect()
[NOT_COLUMN_OR_STR] Argument `col` should be a Column or str, got Column.
6 More Replies
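The confusing "should be a Column, got Column" message typically means a classic-API Column object reached a Spark Connect DataFrame (or vice versa); the two APIs live under different module paths. A quick heuristic for telling which flavor an object belongs to — the helper name is made up, and the demonstration object below is a plain stand-in, since real code would pass a DataFrame or Column:

```python
def api_flavor(obj) -> str:
    """Classify an object as Spark Connect or classic PySpark by its module path."""
    return "connect" if type(obj).__module__.startswith("pyspark.sql.connect") else "classic"

# Stand-in demonstration with a plain object (module "builtins", so the classic branch);
# on a cluster you would call api_flavor(df) and api_flavor(col("x")) and compare.
print(api_flavor(object()))
```

If the DataFrame reports "connect" and the column reports "classic", the imports are mixing the two APIs.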
- 29 Views
- 0 replies
- 0 kudos
We use Databricks widgets in our Python notebooks to pass parameters in jobs, but also when running the notebooks manually (outside of a job context) for various reasons. We're a small team, but I've noticed that when I create a notebook an...
- 41 Views
- 0 replies
- 0 kudos
I am trying to download the course materials (.dbc file and the presentation slides) in Advanced Data Engineering with Databricks, ID: E-VDG8QV. However, I do not see those materials even when I scroll all the way down on that page. I have tried multiple br...
- 39 Views
- 0 replies
- 0 kudos
I noticed that on some Databricks 14.3 clusters, I get DataFrames of type pyspark.sql.connect.dataframe.DataFrame, while on other clusters, also with Databricks 14.3, the exact same code gets DataFrames of type pyspark.sql.DataFrame. pyspark.sql.conne...
- 39542 Views
- 13 replies
- 4 kudos
The "Download CSV" button in the notebook seems to work only for results with at most 1000 entries. How can I export larger result sets as CSV?
Latest Reply
If you have a large dataset, you might want to export it to a bucket in Parquet format from your notebook:
%python
df = spark.sql("select * from your_table_name")
df.write.parquet(your_s3_path)
12 More Replies
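For CSV specifically, the same write pattern applies (df.write.csv instead of parquet). When rows are instead fetched through a connector, streaming them into CSV locally sidesteps the UI's row cap; a small stdlib sketch of that local half, where the inline rows are stand-ins for a fetched result set:

```python
import csv
import io

def rows_to_csv(rows, header) -> str:
    """Serialize an iterable of row tuples to CSV text, header row first."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

# Stand-in rows; in practice these would come from e.g. cursor.fetchmany() in a loop.
print(rows_to_csv([(1, "a"), (2, "b")], ["id", "val"]))
```

Writing to an io.StringIO (or an open file) keeps memory flat even for result sets far beyond 1000 rows.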
- 661 Views
- 2 replies
- 0 kudos
Hello, I am trying to create a Job via the Databricks SDK. As input, I use the JSON generated via the Workflows UI (Workflows -> Jobs -> View YAML/JSON -> JSON API -> Create), which generates pavel_job.json. When trying to run the SDK function jobs.create as dbk = WorkspaceCli...
Latest Reply
Hey there! I have been using Volumes to get the files. It looks like this:
dbk = WorkspaceClient(host=args.host, token=args.token, auth_type="pat")
file_path = "/Volumes/{{your_catalog}}/{{your_schema}}/json_volumes/sample1.json"
content = dbutils.f...
1 More Replies
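On the original question: the SDK's jobs.create takes keyword arguments rather than one raw JSON blob, so the UI-exported payload is usually loaded and unpacked — with the caveat that deeply nested fields may still need the SDK's typed classes. A sketch, where the inline JSON string is a stand-in for reading pavel_job.json:

```python
import json

# Stand-in for open("pavel_job.json").read(); keys follow the Jobs API "create" payload.
raw = '{"name": "pavel_job", "max_concurrent_runs": 1, "tasks": []}'
job_spec = json.loads(raw)

# With a WorkspaceClient `w`, the dict is typically unpacked into keyword arguments:
# w.jobs.create(**job_spec)
print(job_spec["name"])
```

The Volumes approach in the reply above covers getting the file contents onto the cluster; this covers turning them into arguments the SDK accepts.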
by
Mits
• New Contributor II
- 1239 Views
- 4 replies
- 3 kudos
I am trying to send email alerts to a non-Databricks user. I am using the Alerts feature available in SQL. Can someone help me with the steps? Do I first need to add a Notification Destination through Admin settings and then use this newly added desti...
Latest Reply
Hi @Mitali Lad, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...
3 More Replies
by
Phani1
• Valued Contributor
- 44 Views
- 1 replies
- 0 kudos
Hi Team, could you please provide the details/process for integrating Azure Databricks Unity Catalog and AAD? Regards, Phani
Latest Reply
Hello @Phani1, these doc pages might be useful for you:
- Set up and manage Unity Catalog
- Sync users and groups from Microsoft Entra ID