- 1614 Views
- 1 replies
- 2 kudos
Create new notebooks with code
Is it possible to create new notebooks from a notebook in Databricks? I have tried this code, but all of them are generic files, not notebooks. notebook_str = """# Databricks notebook source import pyspark.sql.functions as F import numpy as np # CO...
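One approach worth trying (a sketch, not a confirmed fix): the Databricks Workspace Import REST API (`POST /api/2.0/workspace/import`) imports base64-encoded source as a notebook, rather than a plain file, when `format` is `SOURCE` and a `language` is supplied. The target path, host, and token below are placeholders.

```python
import base64

def build_import_payload(path: str, source: str, language: str = "PYTHON") -> dict:
    """Build the JSON body for the workspace import endpoint.

    format=SOURCE plus a language is what makes the imported file a
    notebook instead of a generic workspace file.
    """
    return {
        "path": path,
        "format": "SOURCE",
        "language": language,
        "overwrite": True,
        "content": base64.b64encode(source.encode("utf-8")).decode("ascii"),
    }

notebook_str = """# Databricks notebook source
import pyspark.sql.functions as F
"""

# Path is illustrative; adjust to your workspace.
payload = build_import_payload("/Users/someone@example.com/generated_nb", notebook_str)

# To actually create the notebook (requires a real host and PAT):
# import requests
# requests.post(f"https://{host}/api/2.0/workspace/import",
#               headers={"Authorization": f"Bearer {token}"},
#               json=payload)
```

The `# Databricks notebook source` header line is still useful to keep in the source string, but it is the `format`/`language` pair on the import call that determines the object type.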
- 2 kudos
Unfortunately %run does not help me, since I can't %run a .py file; I still need my code in notebooks. I am transpiling proprietary code to Python using Jinja templates. I would like to have the output as notebooks, since those are most convenient to edit...
- 1629 Views
- 0 replies
- 0 kudos
Informatica jobs from Databricks
Hi Team, how can we call Informatica jobs from Databricks? Could you please suggest on this? Regards, Phanindra
- 2442 Views
- 1 replies
- 0 kudos
Resolved! DLT Pipeline Graph is not detecting dependencies
Hi, this is my first Databricks project. I am loading data from a UC external volume in ADLS into tables and then splitting one of the tables into two tables based on a column. When I create a pipeline, the tables don't have any dependencies, and this is...
- 0 kudos
While re-implementing my pipeline to publish to dev/test/prod instead of bronze/silver/gold, I think I found the answer. The downstream tables need to use the LIVE schema.
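As a sketch of that fix (table, column, and volume names are illustrative): downstream DLT tables must reference upstream tables through the `LIVE` virtual schema, not a fully qualified name, for the pipeline graph to detect the dependency.

```sql
-- Upstream table loaded from the external volume
CREATE OR REFRESH LIVE TABLE readings_raw
AS SELECT * FROM read_files('/Volumes/my_catalog/my_schema/my_volume/readings/');

-- Downstream table: referencing LIVE.readings_raw is what lets DLT
-- draw the dependency edge in the pipeline graph.
CREATE OR REFRESH LIVE TABLE readings_type_a
AS SELECT * FROM LIVE.readings_raw WHERE reading_type = 'A';
```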
- 2211 Views
- 1 replies
- 1 kudos
Unpivoting data in live tables
I am loading data from CSV into live tables. I have a live Delta table with data like this: WaterMeterID, ReadingDateTime1, ReadingValue1, ReadingDateTime2, ReadingValue2. It needs to be unpivoted into this: WaterMeterID, ReadingDateTime1, ReadingValue1...
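One hedged sketch of an unpivot for repeated (DateTime, Value) column pairs, using Spark SQL's `stack()` expression. Column names follow the post; the helper function and DataFrame names are illustrative, not from the original.

```python
def build_stack_expr(n_pairs: int) -> str:
    """Build a stack() expression turning N (ReadingDateTimeX, ReadingValueX)
    column pairs into one (ReadingDateTime, ReadingValue) row per pair."""
    cols = ", ".join(
        f"ReadingDateTime{i}, ReadingValue{i}" for i in range(1, n_pairs + 1)
    )
    return f"stack({n_pairs}, {cols}) AS (ReadingDateTime, ReadingValue)"

# In a live-table definition this would be applied with selectExpr, e.g.:
# unpivoted = df.selectExpr("WaterMeterID", build_stack_expr(2))
```

`stack(n, ...)` emits one output row per pair of input columns, keeping `WaterMeterID` as the identifier column, which matches the unpivoted shape described above.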
- 940 Views
- 0 replies
- 0 kudos
How to suppress text output and just show the results in each cell?
How can I turn off displaying all this config text after running cells? I don't need to see it in every cell unless there is an error; it causes readability issues and makes scrolling through the notebook tedious.
- 2243 Views
- 2 replies
- 0 kudos
time travel with DLT
Needed some help with time travel with Delta Live Tables. We were trying to figure out if we can go in and alter the history on this table, and what would happen to data that we mass upload. By this we mean we have data from the past that we would ...
- 0 kudos
Delta Live Tables are built on Delta Lake (Delta tables). Delta tables, through transactions (e.g. insert, update, delete, merge, optimize), create versions of the table. Once a version is created it cannot be altered; it is immutable. Yo...
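The versioning behavior described above can be inspected with Delta's history and time-travel syntax (the table name is illustrative; note that tables managed by a DLT pipeline generally should not be modified outside the pipeline):

```sql
-- List the versions the table's transactions have produced
DESCRIBE HISTORY my_catalog.my_schema.my_table;

-- Read the table as of an earlier version (read-only; history is immutable)
SELECT * FROM my_catalog.my_schema.my_table VERSION AS OF 3;

-- Roll the current state back to an earlier version; this appends a new
-- RESTORE commit to the history rather than rewriting past versions
RESTORE TABLE my_catalog.my_schema.my_table TO VERSION AS OF 3;
```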
- 2078 Views
- 1 replies
- 0 kudos
Upgrade Spark version 3.2 to 3.4+
Hi Team, we would like to upgrade from Spark version 3.2 to 3.4+ (Databricks Runtime 10.4 to 12.2/13.3). We would like to understand how complex this upgrade is and which challenges we may face. What are the technical steps and precautions we need to ...
- 1917 Views
- 0 replies
- 0 kudos
Customer Managed Keys in Databricks (AWS)
Hi Databricks Team, could you please provide detailed steps on how to enable customer-managed keys in a Databricks (AWS) account? If there is a video on it, that would be of great help. Regards, Phanindra
- 3809 Views
- 0 replies
- 0 kudos
Spark Streaming Issues while performing left join
Hi team, I'm stuck on a Spark Structured Streaming use case. Requirement: to read two streaming data frames, perform a left join on them, and display the results. Issue: while performing a left join, the resultant data frame contains only rows where ther...
- 853 Views
- 0 replies
- 0 kudos
AI uses
Delve into the transformative realm of AI applications, where innovation merges seamlessly with technology's limitless possibilities. Explore the multifaceted landscape of AI uses and its dynamic impact on diverse industries at StackOfTuts.
- 1518 Views
- 1 replies
- 0 kudos
Load GCP data to Databricks using R
I'm working with Databricks and Google Cloud in the same project. I want to load specific datasets stored in GCP into an R notebook in Databricks. I can currently see the datasets in BigQuery. The problem is that, using the sparklyr package, I'm not ab...
- 2332 Views
- 0 replies
- 1 kudos
Best AI Art Generator
AI art generator uses artificial intelligence to create captivating artworks, redefining the boundaries of traditional creativity and enabling endless artistic possibilities. AI photo restoration is a groundbreaking technology that employs artificial ...
- 2283 Views
- 1 replies
- 0 kudos
Multi Customer setup
We are trying to do a POC to share resources like compute across multiple customers, while storage will be separate. Is this possible?
- 6245 Views
- 0 replies
- 0 kudos
Alter table
Hi Team, could you please suggest whether there is an alternate approach to alter the table, instead of creating a new table and copying the data as part of the deployment?
- 3939 Views
- 1 replies
- 2 kudos
Stream failure JsonParseException
Hi all! I am having the following issue with a couple of PySpark streams. I have some notebooks, each running an independent file structured streaming job using a Delta bronze table (gzip parquet files) dumped from Kinesis to S3 in a previous job....