Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

TestuserAva
by New Contributor II
  • 3089 Views
  • 7 replies
  • 2 kudos

Getting HTML sign-in page as API response from Databricks API with status code 200

Response:<!doctype html><html><head>    <meta charset="utf-8" />    <meta http-equiv="Content-Language" content="en" />    <title>Databricks - Sign In</title>    <meta name="viewport" content="width=960" />    <link rel="icon" type="image/png" href="...

Latest Reply
SJR
New Contributor III
  • 2 kudos

Hello @Abhishek10745 It was just like you said! We have a completely private instance of Databricks, and the DevOps pipeline that I was using didn't have access to the private VNet. Switching pools solved the problem. Thanks for all the help!

6 More Replies
B_J_Innov
by New Contributor III
  • 1969 Views
  • 2 replies
  • 0 kudos

Make API Call to run job

Hi everyone, I want to trigger a run for a job using an API call. Here's my code:

import shlex
import subprocess

def call_curl(curl):
    args = shlex.split(curl)
    process = subprocess.Popen(args, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @B_J_Innov, To resolve the "Unauthorized" error when triggering a Databricks job via the API, ensure you authenticate using a personal access token (PAT) and include it as the `Bearer` token in your request headers. Verify that your API endpoint i...
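
A minimal sketch of that pattern, assuming a hypothetical workspace URL and job ID, with the PAT read from an environment variable:

import os
import requests

# Placeholders: substitute your own workspace URL and job ID.
workspace_url = "https://<your-workspace>.cloud.databricks.com"
token = os.environ["DATABRICKS_TOKEN"]  # personal access token (PAT)

# Trigger the job via the Jobs 2.1 run-now endpoint, authenticating with a Bearer token.
response = requests.post(
    f"{workspace_url}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 123},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # contains the run_id of the new run

If the call still returns an HTML sign-in page or an Unauthorized error, the token or the network path to the workspace is usually the culprit rather than the job itself.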

1 More Replies
BillGuyTheScien
by New Contributor II
  • 1661 Views
  • 1 reply
  • 0 kudos

Resolved! combining accounts

I have an AWS-based Databricks account with a few workspaces and an Azure Databricks workspace. How do I combine them into one account? I am particularly interested in setting up a single billing drop with all my Databricks costs.

Latest Reply
AlliaKhosla
New Contributor III
  • 0 kudos

Hi @BillGuyTheScien  Greetings! Currently, we do not have a feature to combine usage across multiple clouds into a single account. We do have a feature request for this, and it is being considered for the future; currently, there is no ETA on that. You can bro...

anushajalesh28
by New Contributor II
  • 2563 Views
  • 2 replies
  • 1 kudos

Catalog issue

When I was trying to create a catalog, I got an error saying to specify the Azure storage account and storage container in the following query: CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024 MANAGED LOCATION 'abfss://<databricks-workspace-stack-anu...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @anushajalesh28, To create a catalog in Azure Databricks, you need to specify the Azure storage account and storage container in the MANAGED LOCATION clause.    Let’s break down the query:   CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_270220...
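
A minimal sketch of the full statement from a notebook, with the container, storage account, and path as placeholders; the abfss location must already be reachable by Unity Catalog (for example via an external location and storage credential):

# Hedged sketch: the catalog name comes from the thread; the storage details are placeholders.
spark.sql("""
    CREATE CATALOG IF NOT EXISTS Databricks_Anu_Jal_27022024
    MANAGED LOCATION 'abfss://<container>@<storage-account>.dfs.core.windows.net/<path>'
""")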

1 More Replies
nan
by New Contributor II
  • 2114 Views
  • 1 reply
  • 0 kudos

TIMEZONE

Can I get some help from Databricks to understand how these timestamps are being interpreted? Some are really confusing me. I have timestamps coming into AWS Databricks as String type, and the string timestamps are represented in UTC. I ran below qu...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @nan,  Let’s dive into the intricacies of timestamp interpretation in Databricks on AWS. Timestamp Type in Databricks: A timestamp in Databricks represents an absolute point in time, comprising values for year, month, day, hour, minute, and se...
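
A small, hedged illustration of the point (the timestamp literal is made up): the same string maps to different instants depending on the session time zone, so strings that are already in UTC should be parsed with spark.sql.session.timeZone set to UTC.

# Parse the same string under two session time zones and compare the underlying instants.
spark.conf.set("spark.sql.session.timeZone", "UTC")
spark.sql("""
    SELECT to_timestamp('2024-01-15 10:30:00') AS ts,
           unix_timestamp(to_timestamp('2024-01-15 10:30:00')) AS epoch_seconds
""").show(truncate=False)

spark.conf.set("spark.sql.session.timeZone", "America/New_York")
spark.sql("""
    SELECT to_timestamp('2024-01-15 10:30:00') AS ts,
           unix_timestamp(to_timestamp('2024-01-15 10:30:00')) AS epoch_seconds
""").show(truncate=False)  # same wall-clock display, different epoch_seconds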

Surajv
by New Contributor III
  • 2575 Views
  • 1 reply
  • 0 kudos

Run Spark code in notebook by setting Spark conf at runtime instead of databricks-connect configure

Hi community, I wanted to understand if there is a way to pass config values to the Spark session at runtime rather than using databricks-connect configure to run Spark code. One way I found is given here: https://stackoverflow.com/questions/63088121/config...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Surajv, To pass configuration values to a Spark session at runtime in PySpark without relying on the Databricks Connect configuration, you can access and set Spark configuration parameters programmatically. First, retrieve the current Spark conte...
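
A minimal sketch of what setting configuration programmatically can look like; the specific keys and values below are illustrative examples, not required settings:

from pyspark.sql import SparkSession

# Reuse or create the session, then set configuration values at runtime.
spark = SparkSession.builder.getOrCreate()
spark.conf.set("spark.sql.shuffle.partitions", "64")
spark.conf.set("spark.sql.session.timeZone", "UTC")

# Read a value back to confirm it took effect.
print(spark.conf.get("spark.sql.shuffle.partitions"))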

Flachboard84
by New Contributor II
  • 3198 Views
  • 4 replies
  • 1 kudos

sparkR.session

Why might this be erroring out? My understanding is that SparkR is built into Databricks.
Code:
library(SparkR, include.only=c('read.parquet', 'collect'))
sparkR.session()
Error:
Error in sparkR.session(): could not find function "sparkR.session"

Latest Reply
Flachboard84
New Contributor II
  • 1 kudos

It happens with any code; even something as simple as...x <- 2 + 2

3 More Replies
Sujitha
by Community Manager
  • 1378 Views
  • 0 replies
  • 0 kudos

🌟 Welcome Newcomers! 🌟

Hello and welcome to our wonderful Community! Whether you are here by chance or intention, we're thrilled to have you join us. Before you dive into the plethora of discussions and activities happening here, we'd love to get to know you better! ...

Data_Engineer3
by Contributor II
  • 1228 Views
  • 4 replies
  • 0 kudos

Delete Databricks community post

Hi All, If I make a mistake in a previous post in the Databricks community, how can I delete the post that I made and that has already received replies? Is it possible to delete an old post that someone else has already replied to? Thanks,

Latest Reply
Data_Engineer3
Contributor II
  • 0 kudos

I need to delete my old post, which contains details that should not be shared: https://community.databricks.com/t5/data-engineering/need-to-define-the-struct-and-array-of-struct-field-colum-in-the/m-p/58131#M31022

3 More Replies
Peter_Jones
by New Contributor III
  • 2977 Views
  • 2 replies
  • 0 kudos

Syntax of UPDATE Command in Databricks

Hi All, I am testing the SQL generated by our ETL software to see if it can run on Databricks SQL, which I believe is Delta tables underneath. This is the statement we are testing. As far as I can tell from the manual, the FROM clause is not supported ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Peter_Jones,  1. Update Command in Databricks SQL: The UPDATE statement in Databricks SQL allows you to modify column values for rows that match a specified predicate. Here’s the syntax for the UPDATE statement: UPDATE table_name [table_alias]...
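
A hedged sketch of both forms, with hypothetical table and column names: a plain predicate-based UPDATE, and MERGE INTO for the case the thread raises, where the update needs to be driven by another table (Databricks UPDATE has no FROM clause):

# Simple predicate-based update (table/column names are hypothetical).
spark.sql("""
    UPDATE target_table
    SET status = 'archived'
    WHERE last_seen < '2023-01-01'
""")

# Update driven by another table: use MERGE INTO instead of UPDATE ... FROM.
spark.sql("""
    MERGE INTO target_table AS t
    USING source_table AS s
    ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount
""")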

1 More Replies
dollyb
by Contributor
  • 2764 Views
  • 13 replies
  • 1 kudos

Databricks Connect Scala -

Hi, I'm using Databricks Connect to run Scala code from IntelliJ on a Databricks single node cluster. Even with the simplest code, I'm experiencing this error: org.apache.spark.SparkException: grpc_shaded.io.grpc.StatusRuntimeException: INTERNAL: org.ap...

Latest Reply
dollyb
Contributor
  • 1 kudos

I just hope Databricks will pay attention to it.

12 More Replies
vigneshp
by New Contributor
  • 786 Views
  • 1 reply
  • 0 kudos

bitmap_count() function's output is different in Databricks compared to Snowflake

I have found that the results of the bitmap_count() function differ significantly between Databricks and Snowflake. E.g., Snowflake returns a value of '1' for this code: "select bitmap_count(X'0001056c000000000000')", while Databricks returns a...

Latest Reply
Ayushi_Suthar
Honored Contributor
  • 0 kudos

Hi @vigneshp , Good Day!  In Databricks, bitmap_count function returns the number of bits set in a BINARY string representing a bitmap. This function is typically used to count distinct values in combination with the bitmap_bucket_number() and the bi...
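
A hedged sketch of how this is typically exercised in Databricks; the literal comes from the thread, the table and column names in the distinct-count pattern are hypothetical, and the bitmap functions require a runtime that supports them:

# Count the bits set in a binary bitmap literal.
spark.sql("SELECT bitmap_count(X'0001056c000000000000') AS set_bits").show()

# Distinct-count pattern using the companion bitmap functions (names below are hypothetical).
spark.sql("""
    SELECT sum(bitmap_count(bm)) AS distinct_ids
    FROM (
      SELECT bitmap_bucket_number(id) AS bucket,
             bitmap_construct_agg(bitmap_bit_position(id)) AS bm
      FROM some_table
      GROUP BY bucket
    )
""").show()

Because the two systems encode bitmaps differently, the same hex literal can legitimately yield different counts in Databricks and Snowflake.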

Peter_Jones
by New Contributor III
  • 4214 Views
  • 6 replies
  • 0 kudos

Resolved! Clusters are failing to launch

Hi Guys, I am a complete newbie to Databricks; we are trying to figure out if our data models and ETL can run on it. I have got the failure-to-launch message. I have read this message as well: https://community.databricks.com/t5/data-engineering/cluste...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Peter_Jones, Let’s tackle this cluster launch issue step by step. Quota Exceeded Error: The error message indicates that your cluster launch failed due to exceeding the approved quota for standardEDSv4Family Cores in the westeurope location. ...

5 More Replies
farazanwar
by New Contributor II
  • 3074 Views
  • 1 reply
  • 0 kudos

Jira Connector

Is there any native connector in development for Jira?

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @farazanwar, Jira Software offers a wide range of integrations to enhance its functionality. While there isn’t a native connector specifically developed by the Jira team, there are several apps (also known as add-ons or plugins) available in the A...

Phani1
by Valued Contributor
  • 751 Views
  • 1 reply
  • 1 kudos

Huge data migration from HDFS to Databricks

Hi Team, Could you please help me with the best way/best practices to copy around 3 TB of data (Parquet) from HDFS to Databricks Delta format and create external tables on top of it? Regards, Phanindra

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Hi @Phani1, To efficiently copy around 3 TB of Parquet data from HDFS to Databricks Delta format and create external tables, you can follow these best practices: Use the COPY INTO SQL Command: The COPY INTO SQL command allows you to load data fr...
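
A hedged sketch of one way to do this once the Parquet files are staged in cloud storage the workspace can reach (all paths, catalog, schema, and table names below are placeholders); COPY INTO, as mentioned in the reply, is an alternative better suited to incremental loads:

# Read the staged Parquet files and write them once as Delta at the target path.
(spark.read.parquet("abfss://<container>@<account>.dfs.core.windows.net/landing/events/")
     .write.format("delta")
     .mode("overwrite")
     .save("abfss://<container>@<account>.dfs.core.windows.net/delta/events"))

# Register an external table on top of the Delta location.
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_catalog.my_schema.events
    USING DELTA
    LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/delta/events'
""")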
