- 14804 Views
- 2 replies
- 0 kudos
Resolved! Access Foreign Catalog using Python in Notebook
Hello - I have a foreign catalog which I can access fine in SQL. However, I can't access it from a Python notebook, i.e. this works just fine if I have a notebook using a Pro SQL Warehouse: %sql USE CATALOG <my_foreign_catalog_name>; USE SCHEMA public; S...
Hi, Are you using this in a single user cluster? Also, please tag @Debayan in your next response so that I will get notified.
- 0 kudos
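A common way to reproduce the SQL behaviour from Python is to issue the same statements through spark.sql, or to qualify the table with a three-level name; a minimal sketch, assuming a notebook attached to a Unity Catalog-enabled cluster (catalog, schema, and table names are placeholders, and foreign-catalog queries may also depend on the cluster's access mode, as the reply above hints):

```python
# Sketch, assuming a notebook attached to a Unity Catalog-enabled cluster;
# catalog, schema, and table names below are placeholders.

# Option 1: run the same statements you would put in a %sql cell.
spark.sql("USE CATALOG my_foreign_catalog")
spark.sql("USE SCHEMA public")
df = spark.sql("SELECT * FROM my_table LIMIT 10")

# Option 2: skip the USE statements and qualify with a three-level name.
df = spark.table("my_foreign_catalog.public.my_table")
display(df)
```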
- 2126 Views
- 1 replies
- 0 kudos
--files in spark submit task
Regarding the --files option in the spark-submit task of Databricks jobs, I would like to understand how it works and what the syntax is to pass multiple files to --files. I tried using --files and --py-files, and my understanding is it should make available t...
Hi, could you please check if this helps: https://docs.databricks.com/en/files/index.html Also, please tag @Debayan in your next response, which will notify me. Thank you!
- 0 kudos
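For reference, plain spark-submit expects each of --files and --py-files to take a single comma-separated list of paths; a sketch with placeholder file names (a file shipped via --files can be located on the workers with pyspark.SparkFiles.get):

```shell
# spark-submit syntax for reference (file names are placeholders):
# --files and --py-files each take ONE comma-separated list, no spaces.
spark-submit \
  --files config.yml,lookup.csv \
  --py-files helpers.zip,utils.py \
  main.py

# In the job code, a file shipped via --files can then be read from the
# path returned by pyspark.SparkFiles.get("config.yml").
```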
- 986 Views
- 1 replies
- 0 kudos
Hive Migration best practices
Hi, Could you please share with us the approach and best practices for migrating from Hadoop Hive to Databricks? Regards, Phanindra
Hi, You can try checking the below resources: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-steps-to-successfull...
- 0 kudos
- 1426 Views
- 1 replies
- 0 kudos
Sqoop Migration best practices
Hi, Could you please share with us the approach and best practices for migrating from Hadoop Sqoop to Databricks?
Hi, You can try checking the below resources: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-steps-to-successfull...
- 0 kudos
- 886 Views
- 1 replies
- 0 kudos
Oozie jobs migration to Databricks
Hi, Could you please share with us the approach and best practices for migrating Hadoop Oozie jobs to Databricks?
Hi, You can try checking the below resources on Hadoop migration: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-...
- 0 kudos
- 1779 Views
- 1 replies
- 0 kudos
HDFS to Databricks
Hi, Could you please share with us the approach and best practices for migrating from Hadoop HDFS to Databricks?
Hi, You can try checking the below resources: https://www.databricks.com/resources/ebook/migration-guide-hadoop-to-databricks https://www.databricks.com/solutions/migration/hadoop https://www.databricks.com/blog/2021/08/06/5-key-steps-to-successfull...
- 0 kudos
- 1154 Views
- 1 replies
- 0 kudos
Notebook runs with error when run as a job
I am using a notebook to copy over my database on a schedule (I had no success connecting through the Data Explorer UI). When I run the notebook on its own, it works. When I run it as a scheduled job, I get this error: org.apache.spark.SparkSQLExcept...
Hi, the error shown is minimal; could you please post the whole error if that is possible? Also, please tag @Debayan in your next response, which will notify me. Thank you!
- 0 kudos
- 7819 Views
- 1 replies
- 0 kudos
Resolved! Recover Account Owner
Need help recovering the account owner. Problem: the account owner cannot sign in with its password after SSO was configured. The account owner is a DL for team ownership, so it doesn't have an AWS account and can't be configured in an AD group since it has "+" in the em...
Resolved by temporarily disabling SSO with Active Directory, which wasn't allowing an email to be created with "+".
- 0 kudos
- 3034 Views
- 3 replies
- 1 kudos
DLT Pipeline unable to find custom Libraries/Wheel packages
We have our DLT pipeline and we need to import our custom libraries packaged in wheel files. We are on Azure DBX and we are using Azure DevOps CI/CD to build and deploy the wheel packages to our DBX environment. At the top of our DLT notebook we are impo...
"context_based_upload_for_execute": truein projects.json allowed the code to run - but ended withRuntimeError: Cannot start a remote Spark session because there is a regular Spark session already running.
- 1 kudos
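One pattern that often helps with the wheel-import question above is installing the deployed wheel with %pip at the very top of the DLT notebook, before any other code runs; a minimal sketch, assuming the CI/CD pipeline drops the wheel at a cluster-visible path (the path, wheel name, module, and helper below are all placeholders):

```python
# Sketch: the path, wheel name, and imported module are placeholders.
# In a notebook, %pip must come before all other cells/statements.
%pip install /dbfs/FileStore/wheels/my_custom_lib-1.0.0-py3-none-any.whl

import dlt
from my_custom_lib import build_frame  # hypothetical helper from the wheel

@dlt.table
def my_table():
    # hypothetical: build_frame returns a Spark DataFrame
    return build_frame(spark)
```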
- 2191 Views
- 1 replies
- 0 kudos
Will the cells in the notebook keep running even if the browser is closed?
The Execute ML Model Pipeline cell has finished running; it took 2.27 days. However, the code in the following cell, called Process JSON Output, needs to take a very long time again to run. Can I simply close the browser and shut dow...
Hi, I have just tested it internally; even if the browser is closed, the notebook keeps running. You can start with a quick job to test it. Also, please tag @Debayan in your next response so that I will get notified.
- 0 kudos
- 1503 Views
- 1 replies
- 0 kudos
Is it a bug in DEEP CLONE?
Hi, I'm trying to modify a Delta table using the following approach: 1) shallow clone of the table (source_table); 2) modification of the clone (cloned_table); 3) deep clone of the modified table back to the source table. The source Delta table has 26,752 rows. Current Delt...
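The clone workflow described above can be written out in Delta SQL roughly as follows; a sketch with placeholder table and column names (note that a shallow clone still references the source table's data files, so deep-cloning it back over its own source is worth validating carefully, which may be where the unexpected row counts come from):

```sql
-- Sketch of the described workflow; table and column names are placeholders.
-- 1. Shallow clone: copies only metadata, referencing the source's data files.
CREATE OR REPLACE TABLE cloned_table SHALLOW CLONE source_table;

-- 2. Modify the clone.
UPDATE cloned_table SET some_col = 'new_value' WHERE id = 42;

-- 3. Deep clone the modified table back over the source.
CREATE OR REPLACE TABLE source_table DEEP CLONE cloned_table;
```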
- 7861 Views
- 1 replies
- 1 kudos
Error Row Column access control Demo Unity Catalog
I have this error when I execute the Table ACL & Row and Column Level Security With Unity Catalog demo: Error in SQL statement: UnityCatalogServiceException: [RequestId=9644499b-3555-43d0-a2e7-0fdc29ea85ce ErrorClass=NOT_IMPLEMENTED.NOT_IMPLEMENTED] ...
- 1179 Views
- 0 replies
- 0 kudos
Connecting to databricks ipython kernel from VSCode
I'd like to run python notebooks (.ipynb) from VSCode connecting to the ipython kernel from databricks. I have already connected to an execution cluster from VSCode and am able to run python scripts (.py files) and see the output on my local console....
- 2775 Views
- 0 replies
- 0 kudos
Update values in DataFrame with values from another DataFrame
Hi, I have two data sources: an "admin table" which contains active and inactive employees, and an "hr table" which contains only active employees. I need to update the admin table with data from the hr table. You can see that I need to update the employee with num...
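On Databricks the usual approach to the question above is a left join from the admin DataFrame to the hr DataFrame on the employee key, taking the hr value where a match exists (e.g. F.coalesce(hr[c], admin[c]) per column). The same overlay idea in plain Python, with made-up employee rows, just to illustrate the semantics:

```python
# Plain-Python illustration of updating one table from another; the
# employee numbers and fields below are made up. On Databricks this
# corresponds to admin_df.join(hr_df, "emp_no", "left") with
# F.coalesce(hr[col], admin[col]) selected for each column.

admin = [
    {"emp_no": 1, "name": "Ann", "dept": "HR", "active": False},
    {"emp_no": 2, "name": "Bob", "dept": "IT", "active": True},
]
hr = [
    {"emp_no": 2, "name": "Bob", "dept": "Platform", "active": True},
]

def update_admin(admin_rows, hr_rows, key="emp_no"):
    """Overlay hr values onto admin rows that share the same key;
    rows without a match in hr are kept unchanged."""
    hr_by_key = {row[key]: row for row in hr_rows}
    return [{**row, **hr_by_key.get(row[key], {})} for row in admin_rows]

updated = update_admin(admin, hr)
print(updated)  # Bob's dept now comes from the hr table; Ann is unchanged
```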
- 3222 Views
- 6 replies
- 0 kudos
Need help migrating company customer and partner academy accounts to work properly
Hi, originally I accidentally made a Customer Academy account with my company, which is a Databricks partner. Then I made an account using my personal email and listed my company email as the partner email for the Partner Academy account. That account ...
Hi @Maria_fed, thanks again. I have assigned your case to my colleague and you should be hearing from them soon. Regards, Akshay
- 0 kudos