
Welcome to the Databricks Community

Discover the latest insights, collaborate with peers, get help from experts, and make meaningful connections.

102,387 members
52,749 posts
Registration now open! Databricks Data + AI Summit 2024

Join tens of thousands of data leaders, engineers, scientists and architects from around the world at Moscone Center in San Francisco, June 10–13.  Explore the latest advances in Apache Spark™, Delta Lake, MLflow, LangChain, PyTorch, dbt, Prest...

  • 7641 Views
  • 1 reply
  • 4 kudos
02-12-2024
Meet DBRX, the New Standard for High-Quality LLMs

Get your first look at DBRX: April 25, 2024 | 8 AM PT. If you’re using off-the-shelf LLMs to build GenAI applications, you’re probably struggling with quality, privacy and governance issues. What you need is a way to cost-effectively build a custom LLM...

  • 2298 Views
  • 3 replies
  • 2 kudos
2 weeks ago
Data Warehousing in the Era of AI

AI has the power to address the data warehouse’s biggest challenges — performance, governance and usability — thanks to its deeper understanding of your data and how it’s used. This is data intelligence and it’s revolutionizing the way you query, man...

  • 2983 Views
  • 5 replies
  • 1 kudos
2 weeks ago

Community Activity

Kaizen
by Contributor III
  • 8 Views
  • 3 replies
  • 0 kudos

Databricks Points and Community Rewards Store

Hi! Oddly, I wasn't able to find any information online about the following; I would expect there to be more documentation or a blog post laying it all out. 1) Where are the points accrued on the Databricks profile listed? 2) What website/link do we navigat...

Latest Reply
Kaizen
Contributor III

Oh no! I was actually really excited when I heard about the reward system.

  • 0 kudos
2 More Replies
yogu
by Honored Contributor III
  • 1704 Views
  • 8 replies
  • 73 kudos

Trying to claim reward points but its not reflecting my points

Hi Team, Can anyone help me understand why my reward points still show a 0 balance? My Databricks Community points are not reflecting on the reward claim portal. I logged in for the first time and also waited 3 business days, but it's still not reflecting.

Latest Reply
Kaizen
Contributor III

Can you also share the link for the reward points redemption?

  • 73 kudos
7 More Replies
Kaizen
by Contributor III
  • 5 Views
  • 0 replies
  • 0 kudos

Command to display all computes available in your workspace

Hi, Is there a command you could use to list all computes configured in your workspace (active and non-active)? This would be really helpful for anyone managing the platform to pull all the metadata (tags, etc.) and quickly evaluate all the configura...

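For anyone looking for a concrete starting point: a minimal sketch using the Databricks SDK for Python (assumes databricks-sdk is installed and authentication is already configured; the CLI equivalent is `databricks clusters list`). Note that the clusters API returns active clusters plus recently terminated ones, so very old terminated clusters may not appear.

```
# Minimal sketch: list clusters in the workspace and print name, state and tags.
# Assumes the databricks-sdk package and default authentication are available.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state, cluster.custom_tags)
```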
Shreyash
by New Contributor
  • 129 Views
  • 4 replies
  • 0 kudos

java.lang.ClassNotFoundException: com.johnsnowlabs.nlp.DocumentAssembler

I am trying to serve a PySpark model using an endpoint. I was able to load and register the model normally. I could also load that model and perform inference, but while serving the model I am getting the following error: [94fffqts54] ERROR StatusLog...

Machine Learning
Model serving
sparknlp
Latest Reply
Kaniz
Community Manager

Hi @Shreyash, It looks like your code is encountering a java.lang.ClassNotFoundException for the com.johnsnowlabs.nlp.DocumentAssembler class while serving your PySpark model. This error occurs when the required class is not found in the classpath.  ...

  • 0 kudos
3 More Replies
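For context on threads like this: a common cause is that the Spark NLP dependency is never declared with the logged model, so the serving environment cannot resolve the John Snow Labs classes. A hedged sketch, assuming the pipeline is logged with mlflow.spark.log_model (the spark-nlp version, the `pipeline_model` variable, and any JVM jar configuration the endpoint may still need are assumptions to adapt):

```
# Hedged sketch: declare spark-nlp as an extra pip requirement when logging the
# model, so the serving environment installs the Python package. The fitted
# PipelineModel variable and the pinned version below are hypothetical.
import mlflow

with mlflow.start_run():
    mlflow.spark.log_model(
        spark_model=pipeline_model,                    # hypothetical fitted Spark NLP PipelineModel
        artifact_path="sparknlp_model",
        extra_pip_requirements=["spark-nlp==5.3.3"],   # assumption: match the training version
    )
```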
RakeshRakesh_De
by New Contributor III
  • 156 Views
  • 6 replies
  • 0 kudos

Spark CSV file read option to read blank/empty values from file as empty values instead of null

Hi, I am trying to read a file that has some blank values in a column, and we know Spark converts blank values to null during reading. How do I read blank/empty values as empty values? Tried DBR 13.2, 14.3. I have tried every possible way but it's not w...

Data Engineering
csv
EmptyValue
FileRead
Latest Reply
RakeshRakesh_De
New Contributor III

Don't quote something from Stack Overflow, because those were tried on old Spark versions. Have you tried this yourself to verify whether it really works in Spark 3?

  • 0 kudos
5 More Replies
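One common workaround discussed for this kind of issue (a sketch, not a claim that a single reader option solves it on DBR 13.2/14.3): read the CSV as usual, then backfill empty strings on the string columns, since Spark's CSV reader maps empty fields to null by default.

```
# Sketch: replace nulls with empty strings on string columns after reading.
# The file path is hypothetical; `spark` is the session available in a notebook.
from pyspark.sql.types import StringType

df = spark.read.option("header", "true").csv("/path/to/file.csv")

string_cols = [f.name for f in df.schema.fields if isinstance(f.dataType, StringType)]
df_fixed = df.na.fill("", subset=string_cols)
```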
Sujitha
by Community Manager
  • 2942 Views
  • 2 replies
  • 0 kudos

Exciting Changes to the Databricks Community + Recognition System!

Exciting Changes to the Databricks Community + Recognition System! We have some fantastic news to share with you! In the coming week, we will be launching a fully revamped Databricks Community, aiming to enhance your overall experience and introduce...

Latest Reply
Shalabh007
Honored Contributor

@Sujitha has the Databricks Community Rewards Store been relaunched?

  • 0 kudos
1 More Replies
depp
by Visitor
  • 90 Views
  • 1 reply
  • 0 kudos

Databricks Voucher

Hi folks, I am looking for a free voucher to take the "Data Analyst Associate" certification. Could anyone guide me, please?

Latest Reply
Cert-TeamOPS
New Contributor

@depp Apologies, we currently don't have any promotional discount vouchers to offer, but here is an event that might help you: https://www.databricks.com/lp/data-intelligence-days.

  • 0 kudos
Surajv
by New Contributor III
  • 93 Views
  • 1 reply
  • 0 kudos

Getting client.session.cache.size warning in pyspark code using databricks connect

Hi Community, I have set up a Jupyter notebook on a server and installed Databricks Connect in its kernel to leverage my Databricks cluster compute in the notebook and write PySpark code. Whenever I run my code it gives me the below warning: ```WARN Spark...

Latest Reply
Riyakh
New Contributor II

The warning indicates that the client cache (used to manage connections between your local environment and the Databricks cluster) has reached its maximum size (20 sessions). When this limit is reached, the oldest session is closed to make room for a...

  • 0 kudos
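Building on that reply: the usual way to avoid filling the cache is to reuse one session rather than constructing a new one per cell or per run. A sketch assuming Databricks Connect v2 (DBR 13+ and the databricks-connect package):

```
# Sketch: getOrCreate() returns the already-cached session for the same
# connection parameters instead of opening another one, which is what
# fills the client session cache and triggers the warning.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
spark.range(5).show()
```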
cszczotka
by New Contributor II
  • 64 Views
  • 0 replies
  • 0 kudos

Ephemeral storage: how to create/mount

Hi, I'm looking for information on how to create/mount ephemeral storage on a Databricks driver node in Azure. Does anyone have any experience working with ephemeral storage? Thanks.

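For reference, a minimal sketch under stated assumptions: on Databricks VMs the attached local (ephemeral) disk is conventionally exposed at /local_disk0 on the driver, and anything written there does not survive cluster termination; verify the path on your own cluster before relying on it.

```
# Sketch: write scratch data to the driver's local ephemeral disk and report
# free space. The /local_disk0 path is an assumption about the VM layout.
import os
import shutil

scratch_dir = "/local_disk0/tmp/my_scratch"   # assumption: adjust if not present
os.makedirs(scratch_dir, exist_ok=True)

with open(os.path.join(scratch_dir, "example.txt"), "w") as f:
    f.write("ephemeral scratch data")

total, used, free = shutil.disk_usage(scratch_dir)
print(f"free on local disk: {free / 1e9:.1f} GB")
```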
Dom1
by New Contributor
  • 159 Views
  • 2 replies
  • 0 kudos

Show log4j messages in run output

Hi, I have an issue when running JAR jobs. I expect to see logs in the output window of a run. Unfortunately, I can only see messages that are generated with "System.out.println" or "System.err.println". Everything that is logged via slf4j is only ...

Latest Reply
Kaniz
Community Manager

Hi @Dom1, Ensure that both the slf4j-api and exactly one implementation binding (such as slf4j-simple, logback, or another compatible library) are present in your classpath. If you're developing a library, it's recommended to depend only on slf4j-ap...

  • 0 kudos
1 More Replies
drag7ter
by New Contributor
  • 22 Views
  • 0 replies
  • 0 kudos

Configure Service Principal access to GitLab

I'm facing an issue while trying to run my job in Databricks with my notebooks located in GitLab. When I run the job under my personal user ID it works fine, because I added a GitLab token to my user profile and the job is able to pull the branch from the repository. But whe...

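For anyone hitting the same thing: the job needs Git credentials registered for the identity it runs as (the service principal), not only for your personal user. A hedged sketch with the Databricks SDK for Python, assuming the client is authenticated as the service principal and the username/token values are placeholders:

```
# Sketch: register a GitLab credential for the identity this client runs as.
# When that identity is the service principal, jobs running as it can pull
# notebooks from the GitLab repo. Username and token below are hypothetical.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumption: authenticated AS the service principal

w.git_credentials.create(
    git_provider="gitLab",
    git_username="my-gitlab-user",
    personal_access_token="glpat-xxxxxxxx",  # store in a secret scope in practice
)
```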
Surajv
by New Contributor III
  • 18 Views
  • 0 replies
  • 0 kudos

Getting Python version errors when using PySpark RDDs with Databricks Connect

Hi community, When I use PySpark RDD-related functions in my environment using Databricks Connect, I get the below error. Databricks cluster version: 12.2. `RuntimeError: Python in worker has different version 3.9 than that in driver 3.10, PySpark cannot...

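The error usually just means the local interpreter's minor version differs from the cluster runtime's. A small sketch of the check, under the assumption that DBR 12.2 LTS ships Python 3.9, so the local Databricks Connect environment has to match:

```
# Sketch: fail fast if the local Python minor version does not match the
# cluster runtime. EXPECTED reflects the assumption that DBR 12.2 uses 3.9.
import sys

EXPECTED = (3, 9)

if sys.version_info[:2] != EXPECTED:
    raise SystemExit(
        f"Local Python is {sys.version_info[0]}.{sys.version_info[1]}; "
        f"recreate the environment with Python {EXPECTED[0]}.{EXPECTED[1]} to match the cluster."
    )
print("Local Python matches the cluster runtime.")
```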
174817
by New Contributor II
  • 55 Views
  • 2 replies
  • 0 kudos

Databricks Rust client and/or OpenAPI spec

Hi, I'm looking for a Databricks client for Rust. I could only find these SDK implementations. Alternatively, I would be very happy with the OpenAPI spec. Clearly one exists: the Go SDK implementation contains code to generate itself from such a spec...

Data Engineering
openapi
rust
sdk
unity
Latest Reply
feiyun0112
Contributor

Databricks REST API reference: This reference contains information about the Databricks application programming interfaces (APIs). Each API reference page is presented primarily from a representational state transfer (REST) perspective. Databricks REST...

  • 0 kudos
1 More Replies
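Since there is no official Rust SDK, the practical route is to call the REST API directly. A minimal sketch of the call shape (shown in Python only for brevity; the host/token environment variables and the clusters endpoint are illustrative assumptions):

```
# Sketch: the REST surface a hand-rolled client in any language would call.
# DATABRICKS_HOST and DATABRICKS_TOKEN are assumed to be set in the environment.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()
print([c["cluster_name"] for c in resp.json().get("clusters", [])])
```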
liormayn
by Visitor
  • 107 Views
  • 0 replies
  • 0 kudos

OSError: [Errno 78] Remote address changed

Hello :) As part of deploying an app that previously ran directly on EMR to Databricks, we are running experiments using LTS 9.1 and getting the following error: PythonException: An exception was thrown from a UDF: 'pyspark.serializers.SerializationEr...

liormayn
by Visitor
  • 32 Views
  • 0 replies
  • 0 kudos

Error while encoding: java.lang.RuntimeException: org.apache.spark.sql.catalyst.util.GenericArrayDa

Hello :) We are trying to run an existing flow, which currently works on EMR, on Databricks. We use LTS 10.4, and when loading the data we get the following error: at org.apache.spark.api.python.BasePythonRunner$WriterThread.run(PythonRunner.scala:...


Latest from our Blog

Attributing Costs in Databricks Model Serving

Databricks Model Serving provides a scalable, low-latency hosting service for AI models. It supports models ranging from small custom models to best-in-class large language models (LLMs). In this blog...

2,503 Views • 1 kudos

MLOps Gym - Unity Catalog Setup for MLOps

Unity Catalog (UC) is Databricks' unified governance solution for all data and AI assets on the Data Intelligence Platform. UC is central to implementing MLOps on Databricks, as it is where all your as...

2,778 Views • 1 kudos