- 4243 Views
- 3 replies
- 2 kudos
Hello Community Users, We recently announced a new Large Language Models (LLM) program, the first of its kind on edX! Learn how to develop production-ready LLM applications and dive into the theory behind foundation models. Taught by industry experts...
Latest Reply
Hi @163050, you can download the DBC file from the course; the LLM course is already available in the Customer Academy.
2 More Replies
- 2895 Views
- 2 replies
- 1 kudos
Hi everybody, I have a scenario where multiple teams work with Python and R, and these teams use a lot of different libraries. Because of these dozens of libraries, cluster start-up took a long time. So I created a Docker image, where I can ...
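For readers who land on this thread, here is a minimal sketch of the usual follow-up step: attaching a prebuilt image to a cluster through the Clusters API (Databricks Container Services must be enabled). The workspace URL, token, node type, and image name below are hypothetical placeholders, not values from the original post.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
DATABRICKS_TOKEN = "<personal-access-token>"                       # placeholder

cluster_spec = {
    "cluster_name": "prebaked-libraries",
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "Standard_D32ds_v4",
    "num_workers": 2,
    "docker_image": {
        "url": "myregistry.azurecr.io/python-r-libs:latest",         # placeholder image
        "basic_auth": {"username": "<user>", "password": "<token>"},
    },
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
    json=cluster_spec,
)
print(resp.json())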
Latest Reply
Hi @Fabio Simoes Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
1 More Reply
by
ptawil
• New Contributor III
- 3225 Views
- 2 replies
- 4 kudos
Here is a model I created:
class SomeModel(mlflow.pyfunc.PythonModel):
    def predict(self, context, input):
        # do fancy ML stuff
        # log results
        pandas_df = pd.DataFrame(...insert predictions here...)
        spark_df = spark...
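For reference, a minimal self-contained pyfunc sketch (the class, column names, and artifact path are made up) that keeps predict returning a pandas DataFrame and leaves any Spark conversion to the caller, since a SparkSession is not generally available inside predict on serving workers:
import mlflow
import pandas as pd

class ToyModel(mlflow.pyfunc.PythonModel):
    def predict(self, context, model_input):
        # Toy "prediction": the score is just the row sum of the numeric inputs.
        scores = model_input.select_dtypes("number").sum(axis=1)
        return pd.DataFrame({"prediction": scores})

with mlflow.start_run():
    mlflow.pyfunc.log_model(artifact_path="toy_model", python_model=ToyModel())

# Back on the driver, the caller can convert the result to Spark if needed:
# spark_df = spark.createDataFrame(loaded_model.predict(pandas_df))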
Latest Reply
Any updates on this? I am running into the same issue. @Patrick Tawil, were you able to solve this problem? If so, do you mind sharing?
1 More Reply
- 11162 Views
- 3 replies
- 1 kudos
I saved an XGBoost model in FileStore as a .pkl file. I load the model with the commands below:
model = pickle.load(open('/.../model.pkl', 'rb'))
model.predict_proba(df[features])
The model has been running for some time with the above commands, but I n...
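A runnable version of that pattern, with the path, feature names, and input data as hypothetical placeholders (this assumes the pickled object is an XGBoost classifier exposing the scikit-learn API):
import pickle
import pandas as pd

# Hypothetical DBFS path; FileStore is reachable from Python through the /dbfs mount.
model_path = '/dbfs/FileStore/models/model.pkl'

with open(model_path, 'rb') as f:
    model = pickle.load(f)

features = ['feature_a', 'feature_b']                                    # placeholder feature names
df = pd.DataFrame({'feature_a': [1.0, 2.0], 'feature_b': [0.5, 0.1]})    # placeholder input
print(model.predict_proba(df[features]))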
Latest Reply
Hi @Michael Okelola Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answe...
2 More Replies
- 1426 Views
- 1 reply
- 7 kudos
Latest Reply
I gained good knowledge from your post. It is very clear. Thank you. Keep sharing posts like this; it will be helpful.
- 2524 Views
- 2 replies
- 5 kudos
Share information between tasks in a Databricks job
You can use task values to pass arbitrary parameters between tasks in a Databricks job. You pass task values using the taskValues subutility in Databricks Utilities. The taskValues subutility provide...
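As a quick illustration, a hedged sketch of the taskValues round trip between two tasks of the same job (the task key and value names are made up; dbutils is only available inside Databricks notebooks):
# In an upstream task:
dbutils.jobs.taskValues.set(key="row_count", value=42)

# In a downstream task, referencing the upstream task by its task key:
row_count = dbutils.jobs.taskValues.get(
    taskKey="ingest_task",   # hypothetical upstream task name
    key="row_count",
    default=0,
    debugValue=0,            # used when running the notebook outside a job
)
print(row_count)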
Latest Reply
We urgently hope for this feature, but to date, we have found that it is only available in Python. Do you have any plans to support Scala?
1 More Reply
- 2557 Views
- 3 replies
- 3 kudos
I am using Databricks AutoML (Python SDK) to forecast bed occupancy. (Behind the scenes, Databricks uses MLflow experiments for the AutoML run.) After training with different iterations, I registered the best model in the Databricks Model Registry. Now I am tryi...
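For anyone following along, a minimal hedged sketch of the forecasting call and of loading a registered model back; the DataFrame, column names, horizon, and registry name are hypothetical placeholders:
import databricks.automl
import mlflow

# Hypothetical input: a DataFrame with a timestamp column and a numeric target column.
summary = databricks.automl.forecast(
    dataset=df,
    target_col="occupied_beds",
    time_col="date",
    horizon=30,
    frequency="d",
)

# Load the registered best model by name and version (hypothetical registry name).
model = mlflow.pyfunc.load_model("models:/bed_occupancy_forecast/1")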
Latest Reply
Hi, it could be a bug if the Python version is 3.9.5 and the error is still about compatibility. Could you please raise a support case so we can look into it further?
2 More Replies
by
matte
• New Contributor III
- 13808 Views
- 7 replies
- 16 kudos
Hi everybody, I created a simple Bayesian model using the pymc library in Python. I would like to graphically represent my model using the pymc.model_to_graphviz(model=model) method. However, it seems it does not work within a Databricks notebook, even ...
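One hedged workaround sketch, assuming the graphviz system binary is installed on the cluster (for example via an init script or %sh apt-get install -y graphviz): render the graph to SVG explicitly and hand it to displayHTML instead of relying on the notebook's rich repr. The toy model below is made up.
import pymc as pm

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=1.0)
    obs = pm.Normal("obs", mu=mu, sigma=1.0, observed=[0.1, -0.2, 0.3])

graph = pm.model_to_graphviz(model=model)  # returns a graphviz.Digraph

# displayHTML is a Databricks notebook built-in; .pipe() shells out to the "dot" binary.
displayHTML(graph.pipe(format="svg").decode("utf-8"))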
- 9943 Views
- 7 replies
- 7 kudos
The help of `dbx sync` states that ```for the imports to work you need to update the Python path to include this target directory you're syncing to```. This works quite well whenever the package contains only driver-level functions. However, I ran...
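A hedged sketch of the two pieces that usually come up here (all paths and package names are hypothetical): extend sys.path so driver-side imports resolve, and ship the package to the executors for code that runs inside UDFs.
import sys

# Hypothetical dbx sync target; adjust to wherever you sync the project.
sync_target = "/tmp/users/me/my_project"
sys.path.append(sync_target)  # makes driver-side imports of the synced package work

import my_package  # hypothetical package name

# Code called inside UDFs also needs the files on the executors, e.g. as a zip archive:
spark.sparkContext.addPyFile(f"{sync_target}/my_package.zip")  # hypothetical archive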
Latest Reply
Hi @Davide Cagnoni. Please see my answer to this post: https://community.databricks.com/s/question/0D53f00001mUyh2CAC/limitations-with-udfs-wrapping-modules-imported-via-repos-files I will copy it here for you: If your notebook is in the same Repo as t...
6 More Replies
- 8874 Views
- 4 replies
- 4 kudos
Question
It would be great if you could recommend how I go about solving the problem below. I haven't been able to find much help online.
A. Background:
A1. I have to do text manipulation using Python (like concatenation, convert to a spaCy doc, get verbs...
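Since the post mentions spaCy, here is a small hedged sketch of wrapping verb extraction in a pandas UDF so it scales out across the cluster; the column name is hypothetical, and the sketch assumes the en_core_web_sm model is installed on every node:
import pandas as pd
import spacy
from pyspark.sql.functions import pandas_udf
from pyspark.sql.types import ArrayType, StringType

@pandas_udf(ArrayType(StringType()))
def extract_verbs(texts: pd.Series) -> pd.Series:
    # Loaded lazily on each executor; assumes the model package is installed cluster-wide.
    nlp = spacy.load("en_core_web_sm")
    return texts.apply(lambda t: [tok.lemma_ for tok in nlp(t) if tok.pos_ == "VERB"])

# Usage on a hypothetical DataFrame with a "text" column:
# df = df.withColumn("verbs", extract_verbs("text"))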
Latest Reply
Hi @Krishna Zanwar Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Tha...
3 More Replies
by
hulma
• New Contributor II
- 3808 Views
- 4 replies
- 0 kudos
Hello, I tried to serve my model in real time, but the model process keeps relaunching. I am getting this error in the logs: TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must ...
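The full version of that TypeError lists its own workarounds; below is a hedged sketch of the two most common ones (downgrading protobuf in the serving environment, or switching to the pure-Python protobuf implementation). The log_model call is only indicative.
import os

# Workaround 1 (suggested by the error text itself): use the pure-Python protobuf implementation.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"

# Workaround 2: pin protobuf below 4.x in the environment that serves the model, e.g. when logging it:
# mlflow.pyfunc.log_model(..., pip_requirements=["protobuf<=3.20.3"])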
Latest Reply
Hey there @Hulma Abdul Rahman Hope you are well. Just wanted to see if you were able to find an answer to your question and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!
3 More Replies
by
Dhara
• New Contributor III
- 20811 Views
- 9 replies
- 5 kudos
Hi, I want to access multiple .mdb Access files which are stored in Azure Data Lake Storage (ADLS) or on the Databricks File System using Python. Could you guide me on how I can achieve this? It would be great if you could share some code snippets ...
Latest Reply
@Dhara Mandal Can you please try the below?
# cmd 1
%pip install pandas_access
# cmd 2
import pandas_access as mdb
db_filename = '/dbfs/FileStore/Campaign_Template.mdb'
# Listing the tables.
for tbl in mdb.list_tables(db_filename):
    print(tbl)
...
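Continuing that sketch, each listed table can then be read into pandas with the same library; the table name below is a hypothetical placeholder, and pandas_access relies on the mdbtools system package being available on the cluster:
# cmd 3
df = mdb.read_table(db_filename, 'Campaigns')  # hypothetical table name
df.head()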
8 More Replies
by
Dhara
• New Contributor III
- 2332 Views
- 2 replies
- 0 kudos
Hi, I want to access multiple .mdb Access files which are stored in Azure Data Lake Storage (ADLS) or on the Databricks File System using Python. Can you please help me by explaining how I can do it? It would be great if you could share some code snippet...
Latest Reply
https://community.databricks.com/s/question/0D58Y00008rCmBySAK/access-multiple-mdb-files-using-python
1 More Reply
by
Vik1
• New Contributor II
- 4194 Views
- 4 replies
- 2 kudos
My setup:
Worker type: Standard_D32d_v4, 128 GB Memory, 32 Cores, Min Workers: 2, Max Workers: 8
Driver type: Standard_D32ds_v4, 128 GB Memory, 32 Cores
Databricks Runtime Version: 10.2 ML (includes Apache Spark 3.2.0, Scala 2.12)
I ran a Snowflake quer...
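For context, a hedged sketch of a typical Snowflake read from Databricks via the bundled Spark connector; every connection option, secret scope, and query below is a placeholder, not taken from the original post:
options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": dbutils.secrets.get("my_scope", "snowflake-password"),  # hypothetical secret scope
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

df = (
    spark.read.format("snowflake")
    .options(**options)
    .option("query", "select * from some_table")  # placeholder query
    .load()
)
df.count()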
Latest Reply
Hey there @Vivek Ranjan Checking in. If Joseph's answer helped, would you let us know and mark the answer as best? It would be really helpful for the other members to find the solution more quickly. Thanks!
3 More Replies