by deng77 • New Contributor III
- 20287 Views
- 11 replies
- 2 kudos
I want to add a column to an existing Delta table with a timestamp for when the data was inserted. I know I can do this by including current_timestamp in the SQL statement that inserts into the table. Is it possible to add a column to an existing de...
Latest Reply
Can you please provide information on the additional cost of using this feature compared to not using it at all?
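One hedged sketch of the approach the question asks about, assuming a recent Databricks Runtime and a hypothetical table name `my_table`: Delta supports column default values behind the `allowColumnDefaults` table feature, so the column can be added once with a `current_timestamp()` default. Note that existing rows get NULL; the default only applies to new inserts that omit the column.

```sql
-- Enable column defaults on the table (bumps the Delta writer protocol)
ALTER TABLE my_table SET TBLPROPERTIES ('delta.feature.allowColumnDefaults' = 'supported');

-- Add the audit column, then give it a default for future inserts
ALTER TABLE my_table ADD COLUMN inserted_at TIMESTAMP;
ALTER TABLE my_table ALTER COLUMN inserted_at SET DEFAULT current_timestamp();
```

After this, any INSERT that does not mention `inserted_at` records the insertion time automatically.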
- 2966 Views
- 5 replies
- 1 kudos
I'm looking at using Databricks internally for some data science projects. However, I am very confused about how the pricing works and would obviously like to avoid high spending right now. Internal documentation and within Databricks All-Purpose Compute...
Latest Reply
Hello, I was able to get a very precise cost for Azure Databricks clusters and compute jobs using the Microsoft API and the Databricks API. Then I wrote a simple tool to extract and manipulate the API results and generate detailed cost reports that can be...
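As a rough illustration of how the pricing adds up: the bill is approximately the cloud VM infrastructure cost plus the DBUs consumed multiplied by the DBU price for the compute type. All rates in this sketch are hypothetical placeholders, not current list prices; check your cloud provider's pricing page and your Databricks plan for real numbers.

```python
# Rough cost model for a Databricks cluster (illustrative only).
# All rates are hypothetical placeholders, not real prices.

def estimate_cluster_cost(num_workers, hours, dbu_per_node_hour,
                          dbu_price, vm_price_per_hour):
    """Estimate total cost: VM infrastructure + DBU charges."""
    nodes = num_workers + 1  # workers plus the driver node
    vm_cost = nodes * hours * vm_price_per_hour
    dbu_cost = nodes * hours * dbu_per_node_hour * dbu_price
    return vm_cost + dbu_cost

# Example: 4 workers for 10 hours, 0.75 DBU per node-hour at $0.40/DBU,
# VMs at $0.50/hour (all placeholder numbers).
print(estimate_cluster_cost(4, 10, 0.75, 0.40, 0.50))  # 40.0
```

The split matters because the VM cost shows up on the cloud bill while the DBU cost shows up on the Databricks side, which is why combining both APIs, as the reply describes, is needed for a complete report.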
- 670 Views
- 3 replies
- 1 kudos
I have a date range picker filter in a Databricks Lakeview dashboard. When I open the dashboard there is no date selected, and I want to set a default date. Is that possible with Lakeview dashboard filters?
Latest Reply
Hi there - We're actively working on default filter values, and that will help here. For now, when you change filter values you'll notice the URL changes. You can always bookmark the URL or share the modified one with others, and when they open it, t...
by Gutek • New Contributor II
- 924 Views
- 4 replies
- 1 kudos
I'm trying to import a Lakeview dashboard that I originally exported through the CLI (version 0.213.0). The exported file has the extension .lvdash.json and is a single-line JSON file. I can't get it to work; I tried this command: databricks workspace ...
Latest Reply
Glad you've got everything up and running!
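A hedged sketch of the kind of command involved, since the thread does not record the final fix. Flag names and accepted formats vary across Databricks CLI versions, and the paths here are hypothetical; check `databricks workspace import --help` for your installed version.

```shell
# Sketch only -- verify flags against your CLI version's help output.
databricks workspace import /Workspace/Users/me@example.com/my_dashboard.lvdash.json \
  --file ./my_dashboard.lvdash.json \
  --format AUTO
```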
- 33 Views
- 0 replies
- 0 kudos
I am unable to execute UPDATE statements through a Databricks notebook; I get this error message: "com.databricks.sql.transaction.tahoe.actions.InvalidProtocolVersionException: Delta protocol version is too new for this version of the Databricks Runti...
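This error usually means the table was written with a newer Delta protocol (typically because a table feature was enabled) than the cluster's runtime supports. A hedged first step, with a hypothetical table name, is to inspect what the table requires and compare it with the runtime:

```sql
-- The result of DESCRIBE DETAIL includes minReaderVersion and
-- minWriterVersion; if they exceed what the attached runtime supports,
-- run the UPDATE from a cluster on a newer Databricks Runtime instead.
DESCRIBE DETAIL my_catalog.my_schema.my_table;
```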
- 7793 Views
- 5 replies
- 5 kudos
driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager  # java.sql.DriverManager via the py4j gateway
connection = driver_manager.getConnection(mssql_url, mssql_user, mssql_pass)
connection.prepareCall("EXEC sys.sp_tables").execute()  # run the stored procedure over JDBC
connection.close()

The above code works fine, however...
Latest Reply
judyy • New Contributor III
This blog helped me with the output of the stored procedure: https://medium.com/@judy3.yang/how-to-run-sql-procedure-in-databricks-notebook-e28023555565
- 22 Views
- 0 replies
- 0 kudos
I'm using notebooks to do some transformations. I installed a new whl:
%pip install --force-reinstall /Workspace/<my_lib>.whl
%restart_python
Then I successfully import the installed lib: from my_lib.core import test
However, when I run my code with fo...
- 5775 Views
- 3 replies
- 1 kudos
I have a class in a Python file like this:
from pyspark.sql import SparkSession
from pyspark.dbutils import DBUtils

class DatabricksUtils:
    def __init__(self):
        self.spark = SparkSession.getActiveSession()
        self.dbutils = DBUtil...
Latest Reply
Hi, we are in exactly the same situation. Were you able to solve the problem, or perhaps find a workaround?
- 63 Views
- 0 replies
- 0 kudos
I want to run a parametrized SQL query in a task. Query: select * from {{client}}.catalog.table, with the client value being {{task.name}}. If client is a string parameter, it is replaced with quotes, which throws an error. If table is a dropdown list parame...
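One workaround while string task parameters are substituted as quoted literals: build the statement in a Python task instead, and validate the catalog name as an identifier before interpolating it. This is a sketch; the names and the validation pattern are assumptions, not a Databricks API.

```python
import re

def quote_identifier(name):
    """Validate a simple SQL identifier and wrap it in backticks so it can
    be interpolated into a statement without being quoted as a string."""
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", name):
        raise ValueError(f"unsafe identifier: {name!r}")
    return f"`{name}`"

client = "acme_corp"  # would come from the task parameter at runtime
query = f"SELECT * FROM {quote_identifier(client)}.catalog.table"
print(query)  # SELECT * FROM `acme_corp`.catalog.table
```

The validation step matters: interpolating an unchecked parameter into SQL text is an injection risk, so rejecting anything that is not a plain identifier keeps the workaround safe.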
- 1994 Views
- 5 replies
- 0 kudos
Hi, I am trying to pass a catalog name as a parameter into the query for a SQL task, and it is pasted with single quotes, which results in an error. Is there a way to pass the raw value, or are there other possible workarounds? Query: INSERT INTO {{ catalog }}.pas.product_snap...
Latest Reply
@EdemSeitkh can you elaborate on your workaround? Curious how you were able to implement an enum parameter in DBSQL. I'm running into this same issue now.
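If the warehouse is on a recent enough Databricks SQL version, the IDENTIFIER() clause may help: it evaluates a constant string expression as a table name, so the quoting that the parameter substitution adds stops being a problem. A hedged sketch using the table from the post:

```sql
-- IDENTIFIER() parses the concatenated string as catalog.schema.table,
-- so a parameter value substituted as a quoted string can still resolve
-- to a real table reference.
INSERT INTO IDENTIFIER({{ catalog }} || '.pas.product_snap')
SELECT ...
```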
- 51 Views
- 1 replies
- 0 kudos
Hello, I have a remote Azure SQL warehouse serverless instance that I can access using databricks-sql-connector. I can read/write/update tables no problem. But I'm also trying to read/write/update tables using local PySpark + JDBC drivers, and when I ...
Latest Reply
Hi @amelia1, how are you?
What you got was indeed the top 5 rows (note that it was the Row class). What does it show when you run display(df)?
I'm thinking it might be something related to your schema; since you did not define one, it can read the da...
by RobinK • New Contributor III
- 1896 Views
- 12 replies
- 11 kudos
Hello, since last night none of our ETL jobs in Databricks are running anymore, although we have not made any code changes. The identical jobs (deployed with Databricks Asset Bundles) run on an all-purpose cluster but fail on a job cluster. We have no...
Latest Reply
I do not believe this is solved; similar to a comment over here: https://community.databricks.com/t5/data-engineering/databrickssession-broken-for-15-1/td-p/70585 We are also seeing this error in 14.3 LTS from a simple example: from pyspark.sql.function...
by TWib • New Contributor III
- 1235 Views
- 7 replies
- 3 kudos
This code fails with the exception [NOT_COLUMN_OR_STR] Argument `col` should be a Column or str, got Column.
File <command-4420517954891674>, line 7
 4 spark = DatabricksSession.builder.getOrCreate()
 6 df = spark.read.table("samples.nyctaxi.trips")
---->...
Latest Reply
We are also seeing this error in 14.3 LTS from a simple example:
from pyspark.sql.functions import col
df = spark.table('things')
things = df.select(col('thing_id')).collect()
[NOT_COLUMN_OR_STR] Argument `col` should be a Column or str, got Column.
- 43 Views
- 0 replies
- 0 kudos
We use Databricks widgets in our Python notebooks to pass parameters in jobs, but also when we run the notebooks manually (outside of a job context) for various reasons. We're a small team, but I've noticed that when I create a notebook an...
- 49 Views
- 0 replies
- 0 kudos
I am trying to download the course materials (.dbc file and the presentation slides) for Advanced Data Engineering with Databricks, ID: E-VDG8QV. However, I do not see those materials even when I scroll all the way down on that page. I have tried multiple br...