- 25040 Views
- 1 replies
- 0 kudos
Convert string date to date after changing format
Hi, I am using Databricks SQL and came across a scenario. I have a date field whose dates are in the format 'YYYY-MM-DD'. I changed their format to 'MM/DD/YYYY' using the DATE_FORMAT() function. EFF_DT = 2000-01-14 EFF_DT_2 = DATE_FORMAT(EFF_DT, 'MM/d...
- 0 kudos
If you use to_date, you will get a date column, as mentioned above. If you want to use the format MM/dd/yyyy you can use date_format, but this will return a string column. In order to use Spark date functions, the date string should comply with Spark DateTyp...
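For reference, a minimal PySpark sketch of the distinction described in the reply, using the EFF_DT value from the question (the other column names are illustrative):

```python
# to_date yields a DateType column; date_format yields a formatted string column.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("2000-01-14",)], ["EFF_DT"])

df = (df
      .withColumn("EFF_DT_DATE", F.to_date("EFF_DT", "yyyy-MM-dd"))        # DateType
      .withColumn("EFF_DT_2", F.date_format("EFF_DT_DATE", "MM/dd/yyyy"))  # StringType
     )
df.printSchema()  # EFF_DT_DATE: date, EFF_DT_2: string
df.show()
```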
- 7469 Views
- 0 replies
- 0 kudos
Workspace region
ERROR: Your workspace region is not yet supported for model serving; please see https://docs.databricks.com/machine-learning/model-serving/index.html#region-availability for a list of supported regions. The account is in ap-south-1. I can see there is...
- 2112 Views
- 1 replies
- 0 kudos
How to install an AWS .pem file in a Databricks cluster to make a DB connection to MySQL RDS
I am trying to make a connection between AWS MySQL RDS and Databricks. I am using the code below to establish the connection, but it fails because the certificate is not installed. I have the .pem file with me. Could anyone help on how to install this in D...
- 0 kudos
Hi, could you please provide the error code or the full error stack? Please tag @Debayan with your next comment, which will notify me. Thank you!
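Since the full code was not posted, here is only a hedged sketch of one common approach: import the RDS CA .pem into a Java truststore, then point MySQL Connector/J at it. All paths, hostnames, and passwords below are placeholders:

```python
# Assumes a Databricks notebook where `spark` is predefined.
# Step 1 (e.g., in a %sh cell or an init script) - build a truststore from the .pem:
#   keytool -importcert -alias rds-ca -file /dbfs/FileStore/certs/rds-ca.pem \
#           -keystore /local_disk0/rds-truststore.jks -storepass changeit -noprompt
# Step 2 - reference the truststore via Connector/J SSL options:
jdbc_url = (
    "jdbc:mysql://<rds-endpoint>:3306/<database>"
    "?sslMode=VERIFY_CA"
    "&trustCertificateKeyStoreUrl=file:/local_disk0/rds-truststore.jks"
    "&trustCertificateKeyStorePassword=changeit"
)
df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "<schema.table>")
      .option("user", "<user>")
      .option("password", "<password>")
      .load())
```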
- 6705 Views
- 2 replies
- 1 kudos
Resolved! Download Dolly model on local machine
Hi~ I am new to LLM engineering and am trying to download the Dolly-v2-7b model to my local machine, so I don't need to connect to the internet each time I run Dolly-v2-7b. Is it possible to do that? Thanks a lot!
- 1 kudos
Hi Kaniz and Sean, thanks for your responses and time. I was trying Kaniz's method, but got a reply from Sean, so I tried that too. I downloaded the file from the link Sean provided and saved it on my local machine, then used the code for Dolly-v2 (htt...
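For anyone landing here, a sketch of the offline workflow using the Hugging Face transformers API (the local directory is an assumption; the model id is from the thread):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "databricks/dolly-v2-7b"
local_dir = "./dolly-v2-7b"  # hypothetical local path

# One-time download (requires internet), then persist to disk.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer.save_pretrained(local_dir)
model.save_pretrained(local_dir)

# Afterwards, load fully offline from the saved copy.
tokenizer = AutoTokenizer.from_pretrained(local_dir, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(local_dir, local_files_only=True)
```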
- 13182 Views
- 5 replies
- 0 kudos
Python notebook crashes with "The Python kernel is unresponsive"
While using a Python notebook that works on my machine, it crashes at the same point with the errors "The Python kernel is unresponsive" and "The Python process exited with exit code 134 (SIGABRT: Aborted).", but with no stack trace for debugging the ...
- 0 kudos
I am using the following: DBR 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12). Fatal error: The Python kernel is unresponsive. --- The Python process exited with exit code 134 (S...
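Exit code 134 (SIGABRT) with an unresponsive kernel is frequently a driver out-of-memory symptom rather than a code bug. A hedged sketch of one common mitigation, assuming the crash happens while collecting a large DataFrame to pandas (the path is illustrative):

```python
# Assumes a Databricks notebook where `spark` is predefined.
big_df = spark.read.parquet("/mnt/data/large_table")  # hypothetical dataset

# Instead of big_df.toPandas() on the full data (which pulls everything onto
# the driver), inspect a bounded subset:
preview = big_df.limit(10_000).toPandas()
print(preview.describe())
```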
- 4243 Views
- 2 replies
- 2 kudos
Resolved! Databricks community reward store is not working
Hi guys, does anybody know when the Databricks community reward store portal will open? I see it's still under construction. @Retired_mod @Sujitha
- 2798 Views
- 2 replies
- 1 kudos
Databricks notebook issue
Hi, I'm trying to run an ADF pipeline. However, it fails at the Notebook activity with the error below. Error: NoSuchMethodError: com.microsoft.sqlserver.jdbc.SQLServerBulkCopy.writeToServer(Lcom/microsoft/sqlserver/jdbc/ISQLServerBulkRecord;)V I think i...
- 1 kudos
@shan_chandra Thanks for your reply. As per your suggestion, I changed the Databricks version from 9.1 LTS to 12.2 LTS. But after this change, when I check the library you provided (i.e. com.microsoft.azure:spark-mssql-connector_2.12:1.3.0) under Maven, it is not...
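A NoSuchMethodError like this usually means the attached connector was built against a different mssql-jdbc or Spark version than the runtime provides, so matching the connector release to the DBR's Spark version is the usual fix. A hedged write sketch, assuming a compatible spark-mssql-connector is attached to the cluster (server, table, and credentials are placeholders):

```python
# Assumes a Databricks notebook where `spark` is predefined.
df = spark.createDataFrame([(1, "a")], ["id", "val"])  # example data

(df.write
   .format("com.microsoft.sqlserver.jdbc.spark")  # the connector's data source
   .mode("overwrite")
   .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>")
   .option("dbtable", "dbo.<table>")
   .option("user", "<user>")
   .option("password", "<password>")
   .save())
```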
- 3864 Views
- 6 replies
- 0 kudos
How to close a GKE Databricks cluster
I have a GKE Databricks cluster with multiple nodes running. I want to shut down nodes when not in use, to reduce cost. Is there any way to pause a GKE cluster on demand?
- 0 kudos
Hi @DeveshKaushik, if this is the case then create a support ticket with Databricks.
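Besides setting autotermination_minutes on the cluster, you can terminate it on demand through the Clusters REST API; a terminated cluster stops incurring compute cost and can be restarted later. A hedged sketch (host, token, and cluster id are placeholders):

```python
import requests

host = "https://<workspace-host>"
headers = {"Authorization": "Bearer <personal-access-token>"}

# POST /api/2.0/clusters/delete terminates the cluster (it is not permanently deleted).
resp = requests.post(f"{host}/api/2.0/clusters/delete",
                     headers=headers,
                     json={"cluster_id": "<cluster-id>"})
resp.raise_for_status()
```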
- 4095 Views
- 3 replies
- 0 kudos
Unable to install pymqi in Azure Databricks
Hi, I am trying to install pymqi via the command below: pip install pymqi. However, I am getting the error message below: Python interpreter will be restarted. Collecting pymqi Using cached pymqi-1.12.10.tar.gz (91 kB) Installing build dependencies: started Inst...
- 0 kudos
I don't think so, because it won't be specific to Databricks; this is all a property of the third-party packages. And there are billions of possible library conflicts. But this is not an example of a package conflict. It's an example of not complet...
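pymqi compiles against the IBM MQ client libraries at install time, and those are not present on Databricks clusters, which is why the build never completes. A heavily hedged sketch, assuming you have obtained the IBM MQ redistributable client from IBM and staged it yourself (the archive path and the /opt/mqm search location are assumptions):

```python
import subprocess

# Unpack the MQ redistributable client to /opt/mqm (where the pymqi build
# looks for it on Linux, per our assumption), then retry the install.
subprocess.run(
    ["bash", "-c",
     "mkdir -p /opt/mqm && "
     "tar -xzf /dbfs/FileStore/mq-redist-client.tar.gz -C /opt/mqm && "
     "pip install pymqi"],
    check=True,
)
```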
- 5824 Views
- 1 replies
- 1 kudos
Resolved! Configure a job to use one cluster instance for multiple jobs
Hi! I have several tiny jobs that run in parallel and I want them to run on the same cluster: - Task type Python Script: I send the parameters this way to run the PySpark scripts. - Job compute cluster created as (copied JSON from the Databricks Job UI) Ho...
- 1 kudos
Unfortunately, running multiple jobs in parallel using a single job cluster is not supported (yet). New in Databricks is the possibility to create a job that orchestrates multiple jobs. These jobs will, however, still use their own cluster (configurati...
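The supported pattern today is one job whose tasks share a cluster via job_cluster_key; tasks without depends_on run in parallel on that shared cluster. A sketch of a Jobs API 2.1 payload (all names and sizes are illustrative):

```python
# Submit with POST /api/2.1/jobs/create (e.g., via the requests library).
job_spec = {
    "name": "parallel-scripts",
    "job_clusters": [{
        "job_cluster_key": "shared",
        "new_cluster": {
            "spark_version": "12.2.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
    }],
    "tasks": [
        {"task_key": "script_a", "job_cluster_key": "shared",
         "spark_python_task": {"python_file": "dbfs:/scripts/a.py"}},
        {"task_key": "script_b", "job_cluster_key": "shared",
         "spark_python_task": {"python_file": "dbfs:/scripts/b.py"}},
    ],
}
```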
- 1374 Views
- 1 replies
- 1 kudos
Is there a solution to display the worker types based on Spark version selection using an API?
Is there a solution that allows us to display the worker types or driver types based on the selected Spark version, using an API?
- 1 kudos
Can you clarify what you mean? Worker and driver types are not related to Spark version.
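The two lists do exist as separate Clusters API endpoints; they are just independent of each other, as the reply notes. A hedged sketch (host and token are placeholders):

```python
import requests

host = "https://<workspace-host>"
headers = {"Authorization": "Bearer <personal-access-token>"}

node_types = requests.get(f"{host}/api/2.0/clusters/list-node-types", headers=headers).json()
versions = requests.get(f"{host}/api/2.0/clusters/spark-versions", headers=headers).json()

print([nt["node_type_id"] for nt in node_types["node_types"]][:5])
print([v["key"] for v in versions["versions"]][:5])
```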
- 2972 Views
- 2 replies
- 2 kudos
Resolved! Reduce EBS Default Volumes
By default, Databricks creates 2 volumes: one of 30 GB and the other of 150 GB. We have a lot of nodes in our pools, and so a lot of terabytes of volumes, but we are not making any use of them in the jobs. Is there any way to reduce the volumes? ...
- 2 kudos
Yes, EBS volumes are essential for shuffle spill, for example. You are probably using them!
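If you do decide to shrink them anyway, the knobs live under aws_attributes in the cluster spec. A hedged sketch (sizes are illustrative, and cutting shuffle space can fail jobs, per the warning above):

```python
cluster_spec = {
    "cluster_name": "small-ebs",
    "spark_version": "12.2.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    "aws_attributes": {
        "ebs_volume_type": "GENERAL_PURPOSE_SSD",
        "ebs_volume_count": 1,
        "ebs_volume_size": 32,  # GB per volume
    },
}
```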
- 7356 Views
- 1 replies
- 0 kudos
Uninstalling a preinstalled Python package from Databricks
The [Datasets](https://pypi.org/project/datasets/) Python package comes preinstalled on Databricks clusters. I want to uninstall it or completely prevent its installation when I create/start a cluster. I couldn't find any solution on Stack Overflow. And I ...
- 0 kudos
@Retired_mod note that you can't actually uninstall packages in the runtime with pip.
- 13295 Views
- 1 replies
- 0 kudos
Databricks cluster launch time
Hi team, we have an @adf pipeline which runs a set of activities before #Azure Databricks notebooks get called. As and when the notebooks are called, our pipeline launches a new cluster for every job, with job compute as Standard F4 with a sing...
- 2999 Views
- 1 replies
- 0 kudos
The job run failed because task dependency types are temporarily disabled
I am trying the recently released conditional tasks (https://docs.databricks.com/en/workflows/jobs/conditional-tasks.html). I have created a workflow where the leaf task depends on multiple tasks and its run_if property is set to AT_LEAST_ONE_SUCCESS...
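For context, a sketch of the run_if setup the question describes, as it appears in a Jobs API 2.1 task definition (task names and the notebook path are illustrative):

```python
leaf_task = {
    "task_key": "leaf",
    "depends_on": [{"task_key": "branch_a"}, {"task_key": "branch_b"}],
    "run_if": "AT_LEAST_ONE_SUCCESS",  # run when any upstream task succeeded
    "notebook_task": {"notebook_path": "/Workspace/jobs/leaf"},
}
```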