Application Deployment in Marketplace
Hi, I want to deploy my Flask application in the Databricks Marketplace. How can I do it? Can you please share the details?
If not, then I believe it would be beneficial: the feature tables contain engineered features, so it's a good idea to document their calculation logic for the benefit of other data scientists. Also, even non-engineered features are many times no...
I also would like to see support added for feature description get/set methods.
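Until such methods exist, a possible workaround (a sketch only; the catalog, schema, table, and column names below are invented) is to keep feature descriptions as column comments on the Delta table that backs the feature table:

```python
# Feature tables are backed by Delta tables, so column comments can
# carry the descriptions until dedicated get/set methods are available.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Set a description on one feature column (names are hypothetical).
spark.sql("""
    ALTER TABLE my_catalog.my_schema.customer_features
    ALTER COLUMN days_since_last_order
    COMMENT 'Days between the last order date and the snapshot date'
""")

# Read descriptions back: DESCRIBE TABLE yields col_name, data_type, comment.
descriptions = {
    row["col_name"]: row["comment"]
    for row in spark.sql(
        "DESCRIBE TABLE my_catalog.my_schema.customer_features"
    ).collect()
}
print(descriptions)
```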
Our jobs have been running fine so far without any issues on a specific workspace. These jobs read data from files in Azure ADLS storage containers and don't use the Hive metastore data at all. Now we attached the Unity metastore to this workspace, created...
@Wojciech_BUK I granted both in the GUI :) You can either search for the display name there (in which case it uses the Managed Identity Object ID), or you can search directly for the value of the "Managed Identity Application ID", and then it works correctly! ...
Hi there, I'm reaching out for some assistance with importing JSON files into Databricks. I'm still a beginner, even though I've gained experience working with various data import batches (CSV/JSON) for application monitoring. I'm currently facing a challenge...
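For the common case, a minimal sketch of loading JSON in a Databricks notebook (the ADLS path is a placeholder, and `spark` is provided by the notebook):

```python
# multiLine is needed when each file holds one pretty-printed JSON
# document rather than one JSON object per line.
df = spark.read.option("multiLine", "true").json(
    "abfss://container@account.dfs.core.windows.net/monitoring/*.json"
)
df.printSchema()
df.show(5)
```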
Hi Team, my Databricks Certified Data Engineer Associate exam, which was scheduled for today, was suspended from the proctor's side after some false alarms were raised; from my end there was an internet disconnection issue for a couple of minutes. I was almost a...
Context: IDE IntelliJ 2023.3.2; library databricks-connect 13.3; Python 3.10. Description: I develop notebooks and Python scripts locally in the IDE and connect to the Spark cluster via databricks-connect for a better developer experience. I download a...
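For reference, a minimal sketch of how a databricks-connect 13.x session is typically created from an IDE, assuming connection details (host, token, cluster) live in a configured default profile or environment variables:

```python
# databricks-connect 13.x exposes a Spark Connect session; with no
# explicit arguments, it reads credentials from the environment or
# the DEFAULT profile in ~/.databrickscfg.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
print(spark.range(5).collect())  # executes on the remote cluster
```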
Late to the discussion, but I too was looking for a way to do this _programmatically_, as opposed to through the UI. The solution I landed on was using the Python SDK (though you could assuredly do this with an API request instead if you're not in Python): w ...
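The snippet is cut off, but the general shape of the SDK approach is roughly this (the `clusters.list()` call is only an illustration; the actual operation depends on what is being managed):

```python
# WorkspaceClient picks up credentials from environment variables or
# ~/.databrickscfg, so no arguments are needed in the common case.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for cluster in w.clusters.list():  # example: enumerate workspace clusters
    print(cluster.cluster_name, cluster.state)
```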
The Unity Catalog migration guide (https://docs.databricks.com/en/data-governance/unity-catalog/migrate.html#before-you-begin) states the following: "Unity Catalog manages partitions differently than Hive. Hive commands that directly manipulate partitions are ...
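For context, these are the kinds of Hive commands the guide is referring to, i.e. ones that manipulate partition metadata directly (the table name is invented):

```python
# On Unity Catalog managed tables, partition metadata is maintained
# automatically; direct manipulation like this is Hive-specific.
spark.sql("ALTER TABLE sales ADD PARTITION (region='emea')")
spark.sql("MSCK REPAIR TABLE sales")
```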
I am trying to connect to SQL through JDBC from a Databricks notebook. (Below is my notebook command.) `val df = spark.read.jdbc(jdbcUrl, "[MyTableName]", connectionProperties); println(df.schema)` When I execute this command with DBR 10.4 LTS it works fine...
Try adding the following parameters to your SQL connection string; it fixed my problem for 13.x and 12.x: `;trustServerCertificate=true;hostNameInCertificate=*.database.windows.net`
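For anyone unsure where those parameters go, a sketch of a full connection (server, database, table, and credentials are placeholders):

```python
# The two extra parameters are appended to the JDBC URL itself.
jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;"
    "trustServerCertificate=true;"
    "hostNameInCertificate=*.database.windows.net"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "MyTableName")
    .option("user", "my_user")
    .option("password", "my_password")
    .load()
)
print(df.schema)
```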
Hi everyone, I am brand new to Databricks and am setting up my first semantic model with RLS, and I have run into an unexpected problem. When I was testing my model with filters applied (which the RLS would handle later on), it ran extremely fast. I look...
Are you trying to use Power BI RLS rules on top of DirectQuery? Can you give an example of the rules you're trying to apply? Are they static roles, or dynamic roles based on the user's UPN/email being in the dataset?
Hi, I am a bit stumped at the moment because I cannot figure out how to get a DLT table definition picked up in a Python notebook. 1. I created a new notebook in Python. 2. Added the following code: `%python import dlt from pyspark.sql.functions import * @dlt.table(...`
OK, it seems that the default language of the notebook and the language of a particular cell can clash. If the default is set to Python, switching a cell to SQL won't work in DLT, and vice versa. This is super unintuitive, to be honest.
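For reference, a minimal Python DLT definition of the shape the original question describes. It only executes when the notebook (default language Python) is attached to a DLT pipeline; running the cell interactively won't materialize the table. The source table here is just an example:

```python
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Example table; name and source are illustrative")
def my_table():
    # `spark` is provided by the DLT runtime.
    return (
        spark.read.table("samples.nyctaxi.trips")
        .where(col("trip_distance") > 0)
    )
```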
Hi, we are using Azure SQL as an external metastore. We are trying to access this external metastore from Databricks warehouse clusters but are getting the error "`data` property must be defined in SQL query response", yet we are able to connect to the same using...
Hi, I am trying to create a global init script via the REST API, which succeeds in the first step using PowerShell. In the second step I am trying to enable it via the REST API and am getting the following error. Any guidance or help is appreciated. ...
This API uses the PATCH method, but you are using POST: `PATCH /api/2.0/workspace-conf` (see https://docs.databricks.com/api/workspace/workspaceconf/setstatus).
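A sketch of the corrected call in Python (host, token, and the configuration key shown are placeholders/examples; the endpoint returns 204 on success):

```python
import requests

# The fix is the HTTP verb: the setstatus endpoint expects PATCH, not POST.
resp = requests.patch(
    "https://<workspace-host>/api/2.0/workspace-conf",
    headers={"Authorization": "Bearer <token>"},
    json={"enableIpAccessLists": "true"},  # example key; use the setting you need
)
resp.raise_for_status()
```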
What is the best practice for implementing parameterization in SQL DLT pipelines (specifically), so that migrating from dev_region to prod_region is easy and requires no manual intervention?
I would love to see a sample implementation of this config table.
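Not an official pattern, but one possible shape for such a config table plus the lookup in a Python DLT pipeline (all table names, keys, and values are invented):

```python
# A small Delta table keyed by environment, read at pipeline start:
#
#   CREATE TABLE admin.pipeline_config (env STRING, key STRING, value STRING);
#   INSERT INTO admin.pipeline_config VALUES
#       ('dev',  'region', 'dev_region'),
#       ('prod', 'region', 'prod_region');
#
# The active environment comes from the DLT pipeline settings, e.g.
#   "configuration": {"env": "dev"}
import dlt

env = spark.conf.get("env")  # set per pipeline; no code change on promotion
config = {
    r["key"]: r["value"]
    for r in spark.read.table("admin.pipeline_config")
                 .where(f"env = '{env}'")
                 .collect()
}

@dlt.table
def source_data():
    # Catalog name resolves to dev_region or prod_region per environment.
    return spark.read.table(f"{config['region']}.sales.orders")
```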
com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException: org.apache.spark.sql.connector.catalog.CatalogNotFoundException: Catalog 'uc-dev' plugin class not found: spark.sql.catalog.uc-dev is not defined ... I get the above when ...
I had the same error ("plugin class not found: spark.sql.catalog... is not defined") immediately after attaching the workspace to Unity Catalog. The error was resolved by restarting the SQL warehouse. It seems that if a SQL warehouse (or any cluster) is running...
Hi, I am facing a problem that I hope to get some help understanding. I have created a function that is supposed to check whether the input data already exists in a saved Delta table; if not, it should run some calculations and append the new data to...
Hi, I'm also having a similar issue. Does creating a temp view and reading it again after saving to a table work?
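One way to make the check-and-append idempotent without re-reading your own writes is a Delta MERGE that inserts only unmatched rows. A sketch, with invented table and key names (`new_data` stands for the freshly calculated DataFrame):

```python
from delta.tables import DeltaTable

# Target table and join key are hypothetical; adapt to your schema.
target = DeltaTable.forName(spark, "my_schema.results")

(
    target.alias("t")
    .merge(new_data.alias("s"), "t.id = s.id")
    .whenNotMatchedInsertAll()  # rows already present are skipped
    .execute()
)
```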