- 7391 Views
- 2 replies
- 1 kudos
Resolved! Download Dolly model on local machine
Hi~ I am new to LLM engineering and am trying to download the Dolly-v2-7b model to my local machine so I don't need to connect to the internet each time I run Dolly-v2-7b. Is it possible to do that? Thanks a lot!
Hi Kaniz and Sean, thanks for your responses and time. I was trying Kaniz's method, but got a reply from Sean, so I tried that too. I downloaded the file from the link Sean provided and saved it on my local machine, then used the code for Dolly-v2 (htt...
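For readers looking for a starting point, here is a minimal sketch of downloading the model once and then loading it offline. It assumes the huggingface_hub and transformers packages and that the model is pulled from the databricks/dolly-v2-7b Hugging Face repo; the local directory path is illustrative, not the exact code from the truncated reply above.

```python
# Minimal sketch: download Dolly-v2-7b once, then load it from disk offline.
# Assumes huggingface_hub and transformers are installed; the local_dir
# path is an illustrative placeholder.
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer

# One-time download of all model files to a local folder.
local_path = snapshot_download(
    repo_id="databricks/dolly-v2-7b",
    local_dir="./dolly-v2-7b",
)

# Later runs can load from the local folder with no network access.
tokenizer = AutoTokenizer.from_pretrained(local_path)
model = AutoModelForCausalLM.from_pretrained(local_path)
```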
- 15729 Views
- 5 replies
- 0 kudos
Python notebook crashes with "The Python kernel is unresponsive"
While using a Python notebook that works on my machine, it crashes at the same point with the errors "The Python kernel is unresponsive" and "The Python process exited with exit code 134 (SIGABRT: Aborted).", but with no stack trace for debugging the ...
I am using DBR 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12). Fatal error: The Python kernel is unresponsive. The Python process exited with exit code 134 (S...
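For context, exit code 134 is 128 + 6, i.e. the process died from SIGABRT, which on Databricks usually indicates a native-level abort (commonly the driver running out of memory) rather than a Python exception, hence the missing stack trace. A hedged diagnostic sketch; psutil ships with most Databricks runtimes, and the helper below is hypothetical:

```python
# Hedged sketch: bracket the suspect cell with memory logging so the last
# surviving output line points at the allocation that triggered the abort.
import psutil

def log_memory(label: str) -> None:
    """Print resident memory usage of the driver at a labeled checkpoint."""
    mem = psutil.virtual_memory()
    print(f"{label}: {mem.percent:.0f}% used, "
          f"{mem.available / 1e9:.1f} GB available")

log_memory("before suspect cell")
# ... run the code that precedes the crash in small pieces ...
log_memory("after suspect cell")
```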
- 4531 Views
- 2 replies
- 2 kudos
Resolved! Databricks community reward store is not working
Hi guys, does anybody know when the Databricks community reward store portal will open? I see it's still under construction. @Retired_mod @Sujitha
- 3102 Views
- 2 replies
- 1 kudos
Databricks notebook issue
Hi, I'm trying to run an ADF pipeline. However, it fails at the Notebook activity with the error below. Error: NoSuchMethodError: com.microsoft.sqlserver.jdbc.SQLServerBulkCopy.writeToServer(Lcom/microsoft/sqlserver/jdbc/ISQLServerBulkRecord;)V I think i...
@shan_chandra Thanks for your reply. As per your suggestion I changed the Databricks version from 9.1 LTS to 12.2 LTS. But after this change, when I check the library you provided (i.e. com.microsoft.azure:spark-mssql-connector_2.12:1.3.0) under Maven, it is not...
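The NoSuchMethodError typically means the installed connector was built against a different Spark/Scala or JDBC driver version than the cluster provides. Once a connector build matching the cluster is installed from Maven, a write usually looks like the sketch below; the source table, URL, target table, and secret names are placeholders, not values from this thread.

```python
# Hedged sketch of writing through the Spark connector for SQL Server, after
# installing a connector build that matches the cluster's Spark/Scala version.
# All connection values below are illustrative placeholders.
df = spark.table("my_source_table")  # hypothetical source DataFrame

(df.write
   .format("com.microsoft.sqlserver.jdbc.spark")
   .mode("append")
   .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb")
   .option("dbtable", "dbo.target_table")
   .option("user", dbutils.secrets.get("scope", "sql-user"))
   .option("password", dbutils.secrets.get("scope", "sql-password"))
   .save())
```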
- 4359 Views
- 6 replies
- 0 kudos
How to close GKE Databricks cluster
I have a GKE Databricks cluster with multiple nodes running. I want to shut the nodes down when not in use to reduce cost. Is there any way to pause a GKE cluster on demand?
Hi @DeveshKaushik, if this is the case then create a support ticket with Databricks.
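There is no literal pause, but terminating a cluster through the Clusters API releases its GKE nodes while keeping the configuration, so it can be restarted on demand; setting autotermination_minutes on the cluster achieves the same automatically. A hedged sketch with placeholder host, token, and cluster ID:

```python
# Hedged sketch: terminate a cluster so its GKE nodes are released, then
# restart it on demand. Host, token, and cluster_id are placeholders.
import requests

HOST = "https://<workspace-url>"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

def terminate(cluster_id: str) -> None:
    # "delete" terminates the cluster but keeps its config for later restart.
    requests.post(f"{HOST}/api/2.0/clusters/delete",
                  headers=HEADERS,
                  json={"cluster_id": cluster_id}).raise_for_status()

def start(cluster_id: str) -> None:
    requests.post(f"{HOST}/api/2.0/clusters/start",
                  headers=HEADERS,
                  json={"cluster_id": cluster_id}).raise_for_status()
```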
- 5074 Views
- 3 replies
- 0 kudos
Unable to install pymqi in Azure Databricks
Hi, I am trying to install pymqi via the command below: pip install pymqi However, I am getting the following error message: Python interpreter will be restarted. Collecting pymqi Using cached pymqi-1.12.10.tar.gz (91 kB) Installing build dependencies: started Inst...
I don't think so, because it won't be specific to Databricks - this is all a property of the third-party packages. And there are countless possible library conflicts. But this is not an example of a package conflict. It's an example of not complet...
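To make the failure mode concrete: pymqi compiles against the native IBM MQ client libraries, so pip alone fails on a node that does not have them. A hedged pre-check follows; the library names "mqic" and "mqm" are the common ones but may vary by MQ installation.

```python
# Hedged sketch: verify a native IBM MQ client library is present before
# running `pip install pymqi`. The library names checked are assumptions.
import ctypes.util

for name in ("mqic", "mqm"):
    path = ctypes.util.find_library(name)
    if path:
        print(f"found MQ client library: {path}")
        break
else:
    print("no IBM MQ client library found; install the MQ client "
          "(e.g. via a cluster init script) before pip installing pymqi")
```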
- 6385 Views
- 1 reply
- 1 kudos
Resolved! Configure job to use one cluster instance for multiple jobs
Hi! I have several tiny jobs that run in parallel and I want them to run on the same cluster: - Tasks of type Python Script: I send the parameters this way to run the PySpark scripts. - Job compute cluster created as (copied JSON from the Databricks Job UI). Ho...
Unfortunately, running multiple jobs in parallel using a single job cluster is not supported (yet). New in Databricks is the possibility to create a job that orchestrates multiple jobs. These jobs will, however, still use their own cluster (configurati...
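One workaround, if the scripts can live as tasks inside a single job rather than separate jobs, is the Jobs API 2.1 shared job cluster: tasks that reference the same job_cluster_key run on one cluster. A hedged sketch with illustrative paths, names, and node type, not the poster's actual configuration:

```python
# Hedged sketch of a Jobs API 2.1 payload where two Python-script tasks
# share one job cluster via job_cluster_key. All values are illustrative.
job_spec = {
    "name": "parallel-tiny-jobs",
    "job_clusters": [{
        "job_cluster_key": "shared",
        "new_cluster": {
            "spark_version": "12.2.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
    }],
    "tasks": [
        {"task_key": "script_a",
         "job_cluster_key": "shared",
         "spark_python_task": {"python_file": "dbfs:/scripts/a.py",
                               "parameters": ["--date", "2023-08-01"]}},
        {"task_key": "script_b",
         "job_cluster_key": "shared",
         "spark_python_task": {"python_file": "dbfs:/scripts/b.py"}},
    ],
}
# POST job_spec to /api/2.1/jobs/create with a workspace token.
```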
- 1516 Views
- 1 reply
- 1 kudos
Is there a way to display the worker types based on the selected Spark version using an API?
Is there a solution that allows us to display the worker or driver types based on the selected Spark version using an API?
Can you clarify what you mean? Worker and driver types are not related to Spark version.
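To illustrate the reply: the available node (worker/driver) types and Spark versions come from two independent Clusters API endpoints, and neither filters by the other. A hedged sketch with placeholder host and token:

```python
# Hedged sketch: node types and Spark versions come from two separate
# Clusters API endpoints; there is no version-based filter on node types.
import requests

HOST = "https://<workspace-url>"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

node_types = requests.get(f"{HOST}/api/2.0/clusters/list-node-types",
                          headers=HEADERS).json()["node_types"]
versions = requests.get(f"{HOST}/api/2.0/clusters/spark-versions",
                        headers=HEADERS).json()["versions"]

print([n["node_type_id"] for n in node_types][:5])
print([v["key"] for v in versions][:5])
```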
- 3250 Views
- 2 replies
- 2 kudos
Resolved! Reduce EBS Default Volumes
By default Databricks creates two volumes: one with 30 GB and the other with 150 GB. We have a lot of nodes in our pools, and so a lot of terabytes of volumes, but we are not making any use of them in the jobs. Is there any way to reduce the volumes? ...
Yes, EBS volumes are essential, for shuffle spill for example. You are probably using them!
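For reference, the EBS volumes attached to each node can be tuned in the cluster spec's aws_attributes block. This is a hedged sketch with illustrative values; the small managed root volume is not controlled this way.

```python
# Hedged sketch: tune per-node EBS storage in the cluster spec's
# aws_attributes block. All values are illustrative.
cluster_spec = {
    "cluster_name": "small-ebs-cluster",
    "spark_version": "12.2.x-scala2.12",
    "node_type_id": "m5.xlarge",
    "num_workers": 4,
    "aws_attributes": {
        "ebs_volume_type": "GENERAL_PURPOSE_SSD",
        "ebs_volume_count": 1,   # volumes attached per node
        "ebs_volume_size": 100,  # GB per volume
    },
}
```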
- 7781 Views
- 1 reply
- 0 kudos
Uninstalling a preinstalled python package from Databricks
The [Datasets](https://pypi.org/project/datasets/) Python package comes preinstalled on Databricks clusters. I want to uninstall it or completely prevent its installation when I create/start a cluster. I couldn't find any solution on Stack Overflow. And I ...
@Retired_mod note that you can't actually uninstall packages in the runtime with pip.
- 13654 Views
- 1 reply
- 0 kudos
Databricks cluster launch time
Hi team, we have an ADF pipeline which runs a set of activities before the Azure Databricks notebooks get called. As and when the notebooks are called, our pipeline launches a new cluster for every job, with job compute as Standard F4 with a sing...
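A common way to cut per-job cluster launch time is to keep pre-warmed VMs in an instance pool and point the job cluster at it, trading some idle cost for much faster starts. A hedged sketch with illustrative values, not the poster's setup:

```python
# Hedged sketch: create an instance pool of warm VMs, then reference it
# from the job cluster via instance_pool_id. All values are illustrative.
pool_spec = {
    "instance_pool_name": "adf-warm-pool",
    "node_type_id": "Standard_F4",
    "min_idle_instances": 2,
    "idle_instance_autotermination_minutes": 30,
}
# POST pool_spec to /api/2.0/instance-pools/create, then use the returned
# instance_pool_id in the job's new_cluster definition:
new_cluster = {
    "spark_version": "12.2.x-scala2.12",
    "instance_pool_id": "<pool-id-from-create-response>",
    "num_workers": 1,
}
```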
- 3385 Views
- 1 reply
- 0 kudos
The job run failed because task dependency types are temporarily disabled
I am trying the recently released conditional tasks (https://docs.databricks.com/en/workflows/jobs/conditional-tasks.html). I have created a workflow where the leaf task depends on multiple tasks and its run_if property is set as AT_LEAST_ONE_SUCCESS...
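For reference, a leaf task of the kind described would look roughly like this in a Jobs API 2.1 payload; the task keys and notebook path are illustrative:

```python
# Hedged sketch: a leaf task depending on two upstream tasks that runs
# if at least one of them succeeded. Names and paths are illustrative.
leaf_task = {
    "task_key": "finalize",
    "depends_on": [{"task_key": "branch_a"}, {"task_key": "branch_b"}],
    "run_if": "AT_LEAST_ONE_SUCCESS",
    "notebook_task": {"notebook_path": "/Workspace/jobs/finalize"},
}
```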
- 2993 Views
- 0 replies
- 0 kudos
Change cloud provider from AWS to Google
I registered a Databricks account and selected AWS as the cloud provider. May I know how to change it to Google? Thanks!
- 5558 Views
- 2 replies
- 2 kudos
Resolved! com.databricks.NotebookExecutionException: FAILED
I am running the comparisons but I get an error; I am working from a Databricks notebook. Could someone help me solve the following error: com.databricks.WorkflowException: com.databricks.NotebookExecutionException: FAILED: Notebook not found: /user...
Two things come to mind: 1. the notebook resides at a path other than '/users/cuenta_user/user/Tests'; 2. the notebook is not saved as a notebook but rather as an ordinary Python file.
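Workspace paths are case sensitive ('/users/...' vs '/Users/...'), so confirming the exact path and object type before calling dbutils.notebook.run narrows this down quickly. A hedged sketch using the Workspace API, with placeholder host, token, and folder:

```python
# Hedged sketch: list the parent folder via the Workspace API to confirm the
# exact (case-sensitive) notebook path and that the object is a NOTEBOOK.
import requests

HOST = "https://<workspace-url>"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

resp = requests.get(f"{HOST}/api/2.0/workspace/list",
                    headers=HEADERS,
                    params={"path": "/Users/cuenta_user/user"})
for obj in resp.json().get("objects", []):
    print(obj["object_type"], obj["path"])  # NOTEBOOK vs FILE matters here

# Once the path is confirmed:
# dbutils.notebook.run("/Users/cuenta_user/user/Tests", 60)
```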
- 2298 Views
- 0 replies
- 0 kudos
Databricks Assistant HIPAA? Future Cost?
With the Public Preview of Databricks Assistant, I have a few questions. 1) If the Azure tenant is HIPAA compliant, does that compliance also include the Databricks Assistant features? 2) Right now the product is free, but what will the cost be? Will we...
Labels (post count):
- .CSV (1)
- Access Data (2)
- Access Databricks (3)
- Access Delta Tables (2)
- Account reset (1)
- adcAws databricks (1)
- ADF Pipeline (1)
- ADLS Gen2 With ABFSS (1)
- Advanced Data Engineering (2)
- AI (5)
- Analytics (1)
- Apache spark (1)
- Apache Spark 3.0 (1)
- Api Calls (1)
- API Documentation (3)
- App (2)
- Application (2)
- Architecture (1)
- asset bundle (1)
- Asset Bundles (3)
- Auto-loader (1)
- Autoloader (4)
- Aws databricks (1)
- AWS security token (1)
- AWSDatabricksCluster (1)
- Azure (7)
- Azure data disk (1)
- Azure databricks (16)
- Azure Databricks Delta Table (1)
- Azure Databricks Job (1)
- Azure Databricks SQL (6)
- Azure databricks workspace (1)
- Azure Unity Catalog (6)
- Azure-databricks (1)
- AzureDatabricks (1)
- AzureDevopsRepo (1)
- Big Data Solutions (1)
- Billing (1)
- Billing and Cost Management (2)
- Blackduck (1)
- Bronze Layer (1)
- CDC (1)
- Certification (3)
- Certification Exam (1)
- Certification Voucher (3)
- CICDForDatabricksWorkflows (1)
- Cloud_files_state (1)
- CloudFiles (1)
- Cluster (3)
- Cluster Init Script (1)
- Comments (1)
- Community Edition (4)
- Community Edition Account (1)
- Community Event (1)
- Community Group (2)
- Community Members (1)
- Compute (3)
- Compute Instances (1)
- conditional tasks (1)
- Connection (1)
- Contest (1)
- Credentials (1)
- csv (1)
- Custom Python (1)
- CustomLibrary (1)
- Data (1)
- Data + AI Summit (1)
- Data Engineer Associate (1)
- Data Engineering (4)
- Data Explorer (1)
- Data Governance (1)
- Data Ingestion & connectivity (1)
- Data Ingestion Architecture (1)
- Data Processing (1)
- Databrick add-on for Splunk (1)
- databricks (4)
- Databricks Academy (1)
- Databricks AI + Data Summit (1)
- Databricks Alerts (1)
- Databricks App (1)
- Databricks Assistant (1)
- Databricks Certification (1)
- Databricks Cluster (2)
- Databricks Clusters (1)
- Databricks Community (10)
- Databricks community edition (3)
- Databricks Community Edition Account (1)
- Databricks Community Rewards Store (3)
- Databricks connect (1)
- Databricks Dashboard (3)
- Databricks delta (2)
- Databricks Delta Table (2)
- Databricks Demo Center (1)
- Databricks Documentation (4)
- Databricks genAI associate (1)
- Databricks JDBC Driver (1)
- Databricks Job (1)
- Databricks Lakehouse Platform (6)
- Databricks Migration (1)
- Databricks Model (1)
- Databricks notebook (2)
- Databricks Notebooks (4)
- Databricks Platform (2)
- Databricks Pyspark (1)
- Databricks Python Notebook (1)
- Databricks Repo (1)
- Databricks Runtime (1)
- Databricks Serverless (2)
- Databricks SQL (5)
- Databricks SQL Alerts (1)
- Databricks SQL Warehouse (1)
- Databricks Terraform (1)
- Databricks UI (1)
- Databricks Unity Catalog (4)
- Databricks User Group (1)
- Databricks Workflow (2)
- Databricks Workflows (2)
- Databricks workspace (3)
- Databricks-connect (1)
- databricks_cluster_policy (1)
- DatabricksJobCluster (1)
- DataCleanroom (1)
- DataDays (1)
- Datagrip (1)
- DataMasking (2)
- DataVersioning (1)
- dbdemos (2)
- DBFS (1)
- DBRuntime (1)
- DBSQL (1)
- DDL (1)
- Dear Community (1)
- deduplication (1)
- Delt Lake (1)
- Delta Live Pipeline (3)
- Delta Live Table (5)
- Delta Live Table Pipeline (5)
- Delta Live Table Pipelines (4)
- Delta Live Tables (7)
- Delta Sharing (2)
- Delta Time Travel (1)
- deltaSharing (1)
- Deny assignment (1)
- Development (1)
- Devops (1)
- DLT (10)
- DLT Pipeline (7)
- DLT Pipelines (5)
- Dolly (1)
- Download files (1)
- DQX (1)
- Dynamic Variables (1)
- Engineering With Databricks (1)
- env (1)
- ETL Pipelines (1)
- Event Driven (1)
- External Sources (1)
- External Storage (2)
- FAQ for Databricks Learning Festival (2)
- Feature Store (2)
- Filenotfoundexception (1)
- Free Edition (1)
- Free trial (1)
- friendsofcommunity (1)
- GCP Databricks (1)
- GenAI (2)
- GenAI and LLMs (1)
- GenAI Course Material (1)
- Getting started (3)
- Google Bigquery (1)
- HIPAA (1)
- Hubert Dudek (2)
- import (2)
- Integration (1)
- JDBC Connections (1)
- JDBC Connector (1)
- Job Task (1)
- JSON Object (1)
- LakeflowDesigner (1)
- Learning (2)
- Lineage (1)
- LLM (1)
- Login (1)
- Login Account (1)
- Machine Learning (3)
- MachineLearning (1)
- Materialized Tables (2)
- Medallion Architecture (1)
- meetup (2)
- Metadata (1)
- Migration (1)
- ML Model (2)
- MlFlow (2)
- Model (1)
- Model Serving (1)
- Model Training (1)
- Module (1)
- Monitoring (1)
- Networking (2)
- Notebook (1)
- Onboarding Trainings (1)
- OpenAI (1)
- Pandas udf (1)
- Permissions (1)
- personalcompute (1)
- Pipeline (2)
- Plotly (1)
- PostgresSQL (1)
- Pricing (1)
- provisioned throughput (1)
- Pyspark (1)
- Python (5)
- Python Code (1)
- Python Wheel (1)
- Quickstart (1)
- Read data (1)
- Repos Support (1)
- Reset (1)
- Rewards Store (2)
- Sant (1)
- Schedule (1)
- Serverless (3)
- serving endpoint (1)
- Session (1)
- Sign Up Issues (2)
- Software Development (1)
- Spark (1)
- Spark Connect (1)
- Spark scala (1)
- sparkui (2)
- Speakers (1)
- Splunk (2)
- SQL (8)
- streamlit (1)
- Summit23 (7)
- Support Tickets (1)
- Sydney (2)
- Table Download (1)
- Tags (3)
- terraform (1)
- Training (2)
- Troubleshooting (1)
- Unity Catalog (4)
- Unity Catalog Metastore (2)
- Update (1)
- user groups (2)
- Venicold (3)
- Vnet (1)
- Voucher Not Recieved (1)
- Watermark (1)
- Weekly Documentation Update (1)
- Weekly Release Notes (2)
- Women (1)
- Workflow (2)
- Workspace (3)