Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

phi_alpaca
by New Contributor III
  • 4221 Views
  • 8 replies
  • 6 kudos

Error at model serving for quantised models using bitsandbytes library

Hello, I've been trying to serve registered MLflow models at a GPU Model Serving endpoint, which works except for models that use the bitsandbytes library. The library is used to quantise LLMs (e.g. Mistral-7B) to 4-bit/8-bit; however, it runs...

Latest Reply
G-M
Contributor
  • 6 kudos

@phi_alpaca We solved it by providing a conda_env.yaml when we log the model; all we needed was to add cudatoolkit=11.8 to the dependencies.

7 More Replies
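For readers hitting the same CUDA error, a minimal sketch of the fix G-M describes might look like the conda_env.yaml below. Everything except the cudatoolkit=11.8 line is an illustrative assumption (package versions, channel, pip dependencies); adapt it to whatever your model actually needs.

```yaml
# conda_env.yaml -- pass via the conda_env argument when logging the model,
# e.g. mlflow.pyfunc.log_model(..., conda_env="conda_env.yaml").
# All versions other than cudatoolkit=11.8 are illustrative placeholders.
channels:
  - conda-forge
dependencies:
  - python=3.10
  - cudatoolkit=11.8   # the addition the reply confirms fixed GPU serving
  - pip
  - pip:
      - mlflow
      - torch
      - transformers
      - bitsandbytes
name: serving_env
```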
yatharth
by New Contributor III
  • 1918 Views
  • 2 replies
  • 0 kudos

Databricks Job cost (AWS)

Hi Databricks Community, I am looking for a formula/way to calculate the estimated cost of a job run, for which I have a few questions: 1. Is there any formula to calculate the cost of a job, like [(EC2 per-hour cost) * (total time the job ran)], and when...

Latest Reply
yatharth
New Contributor III
  • 0 kudos

This looks a little confusing to me; I'm looking for a more straightforward answer, more like a simple formula. Thanks, though, for your reply.

1 More Replies
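In case a rough formula helps future readers: job cost on AWS is commonly approximated as the DBU charge plus the EC2 charge. A minimal sketch, with made-up placeholder rates (look up your real DBU rate for the workload tier and the EC2 on-demand price for your instance type); it ignores EBS, data transfer, spot discounts, and cluster start-up time:

```python
# Hedged sketch: estimating the cost of one Databricks job run on AWS.
# All rates passed in below are placeholders, NOT real prices.

def estimate_job_run_cost(runtime_hours: float,
                          num_workers: int,
                          dbu_per_node_hour: float,
                          dbu_rate_usd: float,
                          ec2_price_per_node_hour: float) -> float:
    """Rough cost = Databricks DBU charge + AWS EC2 charge.

    Counts the driver as one extra node; ignores EBS, data transfer,
    spot discounts, and cluster start-up time.
    """
    nodes = num_workers + 1  # workers plus the driver
    dbu_cost = nodes * dbu_per_node_hour * runtime_hours * dbu_rate_usd
    ec2_cost = nodes * ec2_price_per_node_hour * runtime_hours
    return dbu_cost + ec2_cost

# Example with made-up rates: 2 workers + driver, 1.5 h run.
cost = estimate_job_run_cost(runtime_hours=1.5, num_workers=2,
                             dbu_per_node_hour=2.0, dbu_rate_usd=0.15,
                             ec2_price_per_node_hour=0.384)
```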
thethirtyfour
by New Contributor III
  • 1194 Views
  • 0 replies
  • 0 kudos

sparklyr::spark_read_csv forbidden 403 error

Hi, I am trying to read a CSV file into a Spark DataFrame using sparklyr::spark_read_csv, and I am receiving a 403 access denied error. I have stored my AWS credentials as environment variables and can successfully read the file as an R dataframe using ar...

dofre
by New Contributor II
  • 1059 Views
  • 0 replies
  • 0 kudos

Left Outer Join returns an Inner Join in Delta Live Tables

In our Delta Live Tables pipeline I am simply joining two streaming tables into a new streaming table. We use the following code: @dlt.create_table() def fact_event_faults(): events = dlt.read_stream('event_list').withWatermark('TimeStamp', '4 hours'...

Community Platform Discussions
Delta Live Table
structured streaming
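For anyone hitting the same behaviour: in Spark stream-stream joins, a left outer join only emits unmatched left rows once the watermark has passed, and it additionally requires a time-range join condition so the engine knows when to stop waiting for a match; without one, the output can look exactly like an inner join. A minimal sketch, runnable only inside a DLT pipeline (so treat it as illustrative), with table and column names assumed from the post:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table()
def fact_event_faults():
    # Watermark BOTH sides; names mirror the post but are assumptions.
    events = dlt.read_stream("event_list").withWatermark("TimeStamp", "4 hours")
    faults = dlt.read_stream("fault_list").withWatermark("FaultTime", "4 hours")
    # A stream-stream LEFT OUTER join needs a time-range condition so Spark
    # knows when to give up waiting for a match and emit the left row with
    # nulls. Without it, unmatched left rows are never emitted -- which
    # looks exactly like an inner join.
    cond = (
        (events.EventId == faults.EventId)
        & (faults.FaultTime >= events.TimeStamp)
        & (faults.FaultTime <= events.TimeStamp + F.expr("INTERVAL 4 HOURS"))
    )
    return events.join(faults, cond, "left_outer")
```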
violar
by New Contributor
  • 4342 Views
  • 0 replies
  • 0 kudos

[Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server.

I am using the Databricks JDBC driver to run a certain app. It runs fine for a few minutes to hours, and then I get the error [Databricks][DatabricksJDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 502,...

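Intermittent 502s from the gateway are often transient, so one common mitigation (not an official driver feature) is to wrap queries in a retry with exponential backoff. A minimal sketch; RuntimeError below is a stand-in for whatever exception your JDBC bridge actually raises:

```python
import time

def with_retries(fn, max_attempts=4, base_delay=1.0):
    """Call fn(), retrying on failure with exponential backoff.

    Delays are base_delay, 2*base_delay, 4*base_delay, ... between
    attempts; the last failure is re-raised to the caller.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except RuntimeError:
            if attempt == max_attempts:
                raise  # out of retries; surface the error
            time.sleep(base_delay * 2 ** (attempt - 1))

# Usage sketch (run_query is a hypothetical function wrapping your
# JDBC call): result = with_retries(run_query)
```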
elgeo
by Valued Contributor II
  • 1767 Views
  • 1 replies
  • 2 kudos

Hide widgets logic

Hello, We recently created a notebook to allow users to insert/update values in specific tables. The logic behind the update statements is included in a separate notebook that users don't have access to. However, we would like to know if ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

When you want users to perform some write action (for example, change parameters, etc.), it is usually easiest to build a small app in Azure PowerApps, save those values, and extract them to the table in Delta Lake (so your notebooks will take values...

FranPérez
by New Contributor III
  • 10615 Views
  • 1 replies
  • 1 kudos

Resolved! error loading databricks.proto

Hi, I'm using DBR 13.3 LTS ML, and I want to set up a webhook trigger. I'm following the example notebook at https://learn.microsoft.com/en-us/azure/databricks/_extras/notebooks/source/mlflow/mlflow-model-registry-webhooks-python-client-example.html...

Latest Reply
StephanieAlba
Databricks Employee
  • 1 kudos

Hi @FranPérez, the issue is that databricks-registry-webhooks ships a databricks.proto file that collides with mlflow. Here is the fix: %pip install databricks-registry-webhooks mlflow==2.2.2 I also posted the fix on StackOverflow: https://stackoverfl...

Danny_Lee
by Valued Contributor
  • 3315 Views
  • 1 replies
  • 0 kudos

My Gamification

Hi all, I've been training at https://partner-academy.databricks.com/ and I see this tab for My Gamification; however, whenever I open it, it always says 0 badges, 0 points. I have completed a number of courses, but there's no change. Is this feature...

Latest Reply
Danny_Lee
Valued Contributor
  • 0 kudos

Haven't received anything back on this, and I don't see any others with this issue; maybe it's something on my side.

Axel_Schwanke
by Contributor
  • 722 Views
  • 0 replies
  • 0 kudos

Referencing a Technical Blog on LinkedIn - How to show the Blog Image instead of the Author Image?

When referencing a Technical Blog in a LinkedIn post, the image of the author is displayed rather than the image of the blog itself, which is annoying. Example LinkedIn post: https://www.linkedin.com/posts/axelschwanke_star-struct-the-secret-life-of-t...

nachog99
by New Contributor II
  • 1100 Views
  • 0 replies
  • 0 kudos

Read VCF files using latest runtime version

Hello everyone! I was reading VCF files using the glow library (Maven: io.projectglow:glow-spark3_2.12:1.2.1). The latest version of this library only works with Spark 3.3.2, so if I need to use a newer runtime with a more recent Spark versi...

Milliman
by New Contributor
  • 1614 Views
  • 1 replies
  • 0 kudos

How to add a delay between Databricks workflow job tasks?

I want to add an explicit time delay between Databricks workflow job tasks; any help would be greatly appreciated. Thanks!

Latest Reply
shan_chandra
Databricks Employee
  • 0 kudos

@Milliman - you could add min_retry_interval_millis to add a delay between the start of a failed run and the subsequent retry run. Reference is here

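Note that the retry-interval setting applies to retries of a failed run. Since Workflows has no built-in wait step between tasks, another common workaround is to insert a tiny intermediate task that just sleeps. A minimal sketch; DELAY_SECONDS is a placeholder you would typically wire to a job parameter:

```python
import time

# Hedged sketch: a trivial "delay" task placed between two Workflow tasks.
# DELAY_SECONDS is an assumed value; in practice it would come from a job
# parameter or widget.
DELAY_SECONDS = 300  # e.g. a 5-minute gap between upstream and downstream

def delay_task(seconds: int) -> None:
    """Block for the requested number of seconds, logging progress."""
    print(f"Delaying downstream task for {seconds} seconds...")
    time.sleep(seconds)
    print("Delay complete.")

# delay_task(DELAY_SECONDS)  # uncomment inside the actual job task
```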

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group
Top Kudoed Authors