I am using a Databricks SQL notebook to run these queries. I have a Python UDF like: %python
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType, DoubleType, DateType
def get_sell_price(sale_prices):
    return sale_...
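The UDF above is truncated, so here is a minimal sketch of how such a function is typically written and registered. The body (taking the minimum of a list of candidate prices) and the name `get_sell_price_udf` are assumptions, not the original code; the registration pattern is the standard pyspark one.

```python
# Hypothetical reconstruction: the original body is truncated, so returning the
# lowest of a list of sale prices is an assumption for illustration.
def get_sell_price(sale_prices):
    """Return the lowest price from a list of candidate sale prices."""
    return min(sale_prices) if sale_prices else None

try:
    # On a Databricks cluster, wrap the plain function as a Spark UDF and
    # register it so SQL cells can call it, e.g.
    #   SELECT get_sell_price(prices) FROM sales
    from pyspark.sql.functions import udf
    from pyspark.sql.types import DoubleType

    get_sell_price_udf = udf(get_sell_price, DoubleType())
    spark.udf.register("get_sell_price", get_sell_price, DoubleType())
except (ImportError, NameError):
    pass  # pyspark / the `spark` session only exist on the cluster
```

Keeping the pure Python function separate from the `udf(...)` wrapper makes it easy to unit-test the logic without a Spark session.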
Hi @Vaibhav Gour​, just a friendly follow-up. Do you still need help, or did @Werner Stinckens​'s response help you find the solution? Please let us know.
Does anyone know how to submit a support ticket? I keep getting caught in a loop that takes me back to the community page, but I need to submit an urgent ticket. I'm told our company pays a ridiculous sum for this feature, yet it is impossible to find. Thanks ...
I'm new to Databricks and am trying to migrate some existing SQL code into a Databricks notebook. I have some constants declared and can't find the right way to declare them similarly in my notebook. I tried: %sql
DECLARE label_language CONSTANT VARCHAR(2...
Thank you @Hubert Dudek​, I gave it a try and it does the job, both in SQL and in Python by the way. When the time comes to industrialize this, I'll have to figure out how to create/use/deal with some configuration files (JSON, YAML, or similar). But for explo...
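Since the follow-up mentions moving constants into configuration files, here is a minimal sketch of that approach. The file contents, keys, and the DBFS path in the comment are all hypothetical placeholders.

```python
import json

# Hypothetical config; in a real setup this JSON would live in the repo or on
# DBFS (e.g. /dbfs/FileStore/config/labels.json -- path is an assumption) and
# be read with open(...) instead of being inlined.
config_text = '{"label_language": "FR", "max_rows": 1000}'

constants = json.loads(config_text)
LABEL_LANGUAGE = constants["label_language"]  # plays the role of the SQL CONSTANT
MAX_ROWS = constants["max_rows"]
```

From there, the Python-side constants can be interpolated into query strings built in the notebook, which keeps one source of truth for both SQL and Python cells.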
The SQL API specification in the DLT docs shows an option for adding column comments when creating a table. Is there an equivalent way to do this when creating a DLT pipeline with a Python notebook? The Python API specification in the DLT docs does n...
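One approach worth trying: the DLT Python API's `@dlt.table` decorator accepts a `schema` argument, and when that schema is given as a SQL DDL string, `COMMENT` clauses can be embedded per column. The table name, column names, and source table below are all placeholders; treat this as a sketch to validate in a pipeline, not a confirmed answer.

```python
# Column comments embedded in a DDL-string schema (names are placeholders).
SCHEMA_WITH_COMMENTS = (
    "order_id BIGINT COMMENT 'Unique order identifier', "
    "amount DOUBLE COMMENT 'Order total in USD'"
)

try:
    import dlt  # only available inside a Databricks DLT pipeline

    @dlt.table(
        name="orders_clean",
        comment="Cleaned orders table",
        schema=SCHEMA_WITH_COMMENTS,
    )
    def orders_clean():
        # `spark` is provided by the pipeline runtime; source table is a placeholder.
        return spark.read.table("orders_raw")
except (ImportError, NameError, AttributeError, TypeError):
    pass  # outside a DLT pipeline; decorator shown for illustration only
```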
Hi @PJ Singh​, the documentation says: "The notebook results are stored in workspace system data storage, which is not accessible by users." But you can retrieve these results via the UI, via the get-output command of the Jobs REST API, or via the runs get-output comma...
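The Jobs REST API route mentioned above can be sketched roughly as follows; the workspace hostname, run ID, and token are placeholders you would replace with your own, and the request itself is only defined, not executed, here.

```python
import json
from urllib.request import Request, urlopen

# Placeholders: substitute your own workspace URL and run ID.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
run_id = 42

# Jobs REST API endpoint for retrieving a notebook run's output.
url = f"{host}/api/2.1/jobs/runs/get-output?run_id={run_id}"

def fetch_run_output(token):
    """Call runs/get-output with a personal access token (not executed here)."""
    req = Request(url, headers={"Authorization": f"Bearer {token}"})
    with urlopen(req) as resp:
        return json.loads(resp.read())
```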
I am studying Databricks and have had a Community Edition account since November 19, 2021, but since December 22nd I have not been able to log in; an "Invalid email address or password" error is thrown. When the forgot-password link is clicked, no email is sent to the regi...
Hi everyone, I am working with Databricks notebooks and facing an issue with the Snowflake connector: I want to run DDL/DML statements through it. Can someone please help me out with this? Thanks in advance!
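For running DDL/DML (rather than reading a DataFrame), the Snowflake Spark connector exposes a `Utils.runQuery` helper that can be reached through the JVM gateway. The sketch below assumes that connector is installed on the cluster; every connection value and the table name are placeholders.

```python
# Connection options for the Snowflake Spark connector (all placeholders).
sf_options = {
    "sfUrl": "myaccount.snowflakecomputing.com",
    "sfUser": "my_user",
    "sfPassword": "my_password",
    "sfDatabase": "MY_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MY_WH",
}

ddl = "CREATE TABLE IF NOT EXISTS my_table (id INT, name STRING)"

try:
    # Utils.runQuery executes arbitrary DDL/DML on Snowflake; it does not
    # return a DataFrame, so it suits CREATE/ALTER/DELETE-style statements.
    sc._jvm.net.snowflake.spark.snowflake.Utils.runQuery(sf_options, ddl)
except NameError:
    pass  # `sc` (SparkContext) only exists on a Databricks cluster
```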
Hello, I am trying to run some geospatial transformations in Delta Live Tables using Apache Sedona. I tried defining a minimal example pipeline demonstrating the problem I encounter. In the first cell of my notebook, I install the apache-sedona Python package: %pip...
Hi @Nicolas Jean​, unfortunately, installing third-party Java libraries is not yet supported for Delta Live Tables, so you can't use Sedona with DLT right now. But if you're interested in geospatial work on Databricks, you may loo...
I was trying to create an account in Databricks Community Edition. After finishing my registration, I waited for the verification email, but I still haven't received it. I tried creating a second account with another email, but the same thing happened: I didn't ...
Hi @Jehan Nargia​, thank you for reaching out! Please share the details at community@databricks.com. Let us look into this for you, and we'll check back in with an update.
I have followed the steps given here to parse .accdb files using UCanAccess on Azure Databricks; however, I receive errors. See my code below:# Connection properties
conn_properties = {"driver" : "net.ucanaccess.jdbc.UcanaccessDriver"}
# Path
url = ...
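Since the posted snippet is truncated, here is a sketch of the usual shape of a UCanAccess JDBC read. The file path and table name are hypothetical placeholders, and it assumes the UCanAccess JAR (plus its dependencies) is installed on the cluster.

```python
# Hypothetical path to the Access file on DBFS (adjust to your workspace).
accdb_path = "/dbfs/FileStore/tables/sample.accdb"

conn_properties = {"driver": "net.ucanaccess.jdbc.UcanaccessDriver"}

# UCanAccess JDBC URLs take the form jdbc:ucanaccess://<path-to-file>
url = f"jdbc:ucanaccess://{accdb_path}"

try:
    # Table name is a placeholder; requires the UCanAccess JARs on the cluster.
    df = spark.read.jdbc(url=url, table="my_table", properties=conn_properties)
except NameError:
    pass  # the `spark` session only exists on the cluster
```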
So I'm querying data from parquet files that have a couple of billion records (table 1, or t1), and then have to filter and join with other parquet files containing another couple of billion records (t2). This takes quite a long time to run (like 10h...
Your intuition about views is correct. Views are not materialized, so they are basically just a saved query. Every time you access a view it will have to be recomputed. This is certainly not ideal if it takes a long time (like 10hrs) to materialize a ...
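The advice above (materialize instead of using a view) can be sketched as writing the filtered intermediate result to a Delta table once, then joining against the persisted copy. Paths, table names, the filter predicate, and the join key below are all placeholders.

```python
# Placeholder filter; in practice this is whatever cuts t1 down before the join.
FILTER_PREDICATE = "status = 'active'"

try:
    # Materialize the filtered t1 once as a Delta table, so later joins reuse
    # the persisted result instead of recomputing the 10h query each time.
    filtered = spark.read.parquet("/data/t1").filter(FILTER_PREDICATE)
    filtered.write.format("delta").mode("overwrite").saveAsTable("t1_filtered")

    # Join against the materialized copy (join key "id" is a placeholder).
    t2 = spark.read.parquet("/data/t2")
    joined = spark.table("t1_filtered").join(t2, on="id")
except NameError:
    pass  # the `spark` session only exists on the cluster
```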
I'm mounting a Storage Account to a Databricks cluster in Azure. All the resources are included in a VNet, and private and public subnets have been associated with the Databricks resource. Below I've attached the guide we use for mounting the ADLS G2 to...
Hey there @Derrick Bakhuis​, hope you are well. Just wanted to see if you were able to find an answer to your question and, if so, would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!
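For reference, the ADLS Gen2 mount described in that thread usually follows the service-principal OAuth pattern below. Every identifier in angle brackets is a placeholder, and the mount point name is an assumption.

```python
# OAuth configuration for mounting ADLS Gen2 with a service principal;
# all <bracketed> values are placeholders for your own tenant/app/storage.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": "<client-secret>",
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

try:
    dbutils.fs.mount(
        source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
        mount_point="/mnt/data",  # mount point name is an assumption
        extra_configs=configs,
    )
except NameError:
    pass  # `dbutils` only exists inside a Databricks notebook
```

When the workspace is VNet-injected, the storage firewall and private endpoints also have to allow traffic from the Databricks subnets, which is a common cause of mount failures in this setup.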
@Kunal Gaurav​, Important: This feature is in Public Preview. To enable provisioning to Azure Databricks using Azure Active Directory (Azure AD), you must create an enterprise application for each Azure Databricks workspace. Note: The way provisioning is c...