Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.
Data + AI Summit 2024 - Data Science & Machine Learning

Forum Posts

Joseph_B
by Databricks Employee
  • 2671 Views
  • 1 reply
  • 0 kudos
Latest Reply
Joseph_B
Databricks Employee
  • 0 kudos

You can find the MLflow version in the runtime release notes, along with a list of every other library provided. E.g., for DBR 8.3 ML, you can look at the release notes for AWS, Azure, or GCP. The MLflow client API (i.e., the API provided by installi...
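Beyond the release notes, a quick way to confirm which client version a runtime actually ships is to query package metadata from a notebook. This is a hedged, pure-Python sketch using only the standard library:

```python
import importlib.metadata

def installed_version(pkg: str):
    """Return the installed version of a package, or None if it is absent."""
    try:
        return importlib.metadata.version(pkg)
    except importlib.metadata.PackageNotFoundError:
        return None

# On a Databricks ML runtime this reports the bundled MLflow client version.
print(installed_version("mlflow"))
```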

User16826994223
by Honored Contributor III
  • 1748 Views
  • 1 reply
  • 0 kudos

Multiple where conditions vs. AND (&) in PySpark

I can write the where conditions both ways:

.where((col('state') == state) & (col('month') > startmonth))

.where(col('state') == state).where(col('month') > startmonth)

I think the second one adds readability. Is there any other difference, and which is best?

Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

You can use explain() to see what logical and physical plans get created. This is the best way to see the difference, but as mentioned in the question, both forms should produce the same physical plan.

User16788317466
by Databricks Employee
  • 1569 Views
  • 2 replies
  • 0 kudos

How do I efficiently read image data for a deep learning model?


Latest Reply
Joseph_B
Databricks Employee
  • 0 kudos

Our documentation provides nice examples of preparing image data for training and inference. Training: see the docs for AWS, Azure, GCP. Inference: see the reference solution for AWS, Azure, GCP.
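One generic pattern behind those docs is to load raw bytes lazily in batches and defer decoding to the training worker. This is only a standard-library sketch (the linked docs cover the Spark-native readers); the function name and batch size are illustrative:

```python
from pathlib import Path

def batched_image_bytes(directory, batch_size=32, pattern="*.jpg"):
    """Yield lists of (filename, raw bytes), loading files lazily batch by batch."""
    batch = []
    for path in sorted(Path(directory).glob(pattern)):
        batch.append((path.name, path.read_bytes()))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch
        yield batch
```

Keeping decoding out of the loader lets each worker decode only the images it actually trains on.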

1 More Replies
User16789201666
by Databricks Employee
  • 2263 Views
  • 4 replies
  • 0 kudos

How do you control the cost of provisioning a cluster?

How do you govern the cost of running clusters in Databricks so you're not sticker shocked?

Latest Reply
User16826994223
Honored Contributor III
  • 0 kudos

Using job clusters instead of interactive clusters is one of the most effective ways to control cost.
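Another lever is making sure interactive clusters scale down and shut off when idle. A hedged sketch of a Clusters API payload (field names follow the public API; the values here are purely illustrative):

```json
{
  "cluster_name": "cost-controlled-interactive",
  "spark_version": "8.3.x-cpu-ml-scala2.12",
  "node_type_id": "i3.xlarge",
  "autoscale": { "min_workers": 1, "max_workers": 4 },
  "autotermination_minutes": 30
}
```

Autoscaling keeps the worker count matched to load, and auto-termination stops the bill when nobody is attached.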

3 More Replies
User16789201666
by Databricks Employee
  • 3063 Views
  • 1 reply
  • 0 kudos

When should we use offline store vs online store for Feature Store?

Looking at the docs, we see both options. Can we use both, for example?

Latest Reply
User16789201666
Databricks Employee
  • 0 kudos

The online store is for real-time inference; in most cases you will use the offline store.

User16789201666
by Databricks Employee
  • 2334 Views
  • 2 replies
  • 0 kudos

What is Databricks' model deployment framework?

How do you deploy a model in Databricks?

Latest Reply
RonanStokes_DB
Databricks Employee
  • 0 kudos

The following resources provide more detail on this:
Databricks model registry example notebook: https://docs.databricks.com/_static/notebooks/mlflow/mlflow-model-registry-example.html
Databricks model lifecycle: https://docs.databricks.com/applicatio...

1 More Replies
User16826992666
by Valued Contributor
  • 4853 Views
  • 1 reply
  • 0 kudos

Text length limitations in the display() function

Is there a way to change the limit to the length of strings that can be shown using the display() function in notebooks? If I'm noticing truncation, what can I do?

Latest Reply
User16826992666
Valued Contributor
  • 0 kudos

There is a 500 character limit to strings in columns which is non-configurable. To see the full contents of the column, you can either use the tooltip to expand the cell or download the full results.
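When the full string matters, one common workaround is to pull a small sample to the driver and render it through pandas with the column-width cap lifted. A hedged sketch (assumes the sample fits in driver memory; the DataFrame here stands in for something like df.limit(10).toPandas()):

```python
import pandas as pd

# Lift pandas' own truncation so long strings render in full.
pd.set_option("display.max_colwidth", None)

# Stand-in for `df.limit(10).toPandas()` on a real Spark DataFrame.
pdf = pd.DataFrame({"text": ["x" * 600]})

# to_string() now emits the entire 600-character value, untruncated.
print(pdf.to_string(index=False))
```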


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group