- 794 Views
- 1 replies
- 0 kudos
Setting up a proxy for python notebook
Hi all, I am running a Python notebook with a web scraper. I want to set up a proxy server that I can use to avoid any IP bans when scraping. Can someone recommend a way to set up a proxy server that can be used for HTTP requests sent from a Databricks...
Hi Kaniz, thanks for the reply. I know how to include HTTP proxies in my Python code and redirect the requests. However, I wondered if Databricks has any functionality to set up the proxies? Thank you
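For what it's worth, a minimal sketch of how this is usually handled at the Python level rather than by Databricks itself, assuming a hypothetical proxy endpoint at http://my-proxy.example.com:8080:

```python
import os
import requests

# Hypothetical proxy endpoint - replace with your own proxy service.
PROXY_URL = "http://my-proxy.example.com:8080"

# Option 1: pass proxies explicitly per request.
proxies = {"http": PROXY_URL, "https": PROXY_URL}
resp = requests.get("https://example.com/page", proxies=proxies, timeout=30)
print(resp.status_code)

# Option 2: set environment variables so most HTTP clients in this notebook
# session (requests, urllib, etc.) pick up the proxy automatically.
os.environ["HTTP_PROXY"] = PROXY_URL
os.environ["HTTPS_PROXY"] = PROXY_URL
resp = requests.get("https://example.com/page", timeout=30)
```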
- 511 Views
- 1 replies
- 0 kudos
Unity Catalog cannot display(), but can show() table
Hello all, I'm facing the following issue in a newly set up Azure Databricks - Unity Catalog environment: Failed to store the result. Try rerunning the command. Failed to upload command result to DBFS. Error message: PUT request to create file error Http...
Hi @CharlesDLW, you have a similar use case to the one below. Follow my reply in that thread: https://community.databricks.com/t5/community-discussions/file-found-with-fs-ls-but-not-with-spark-read/m-p/78618/highlight/true#M5972
- 1412 Views
- 2 replies
- 2 kudos
Resolved! jdbc errors when parameter is a boolean
I'm trying to query a table from Java code. The query works when I use a Databricks notebook / query editor directly in Databricks. However, when using JDBC with Spring, I get the following stack trace: org.springframework.jdbc.UncategorizedSQLException:...
As I see it, there are two things: jdbcTemplate converts boolean to bit. This is according to the JDBC specs (this is a "spring-jdbc" thing and, according to the documentation, jdbcTemplate.queryForList makes the best possible guess of the desired type). Data...
- 761 Views
- 1 replies
- 0 kudos
Databricks/Terraform - Error while creating workspace
Hi - I have the below code to create the credentials, storage, and workspace through a Terraform script, but only the credentials and storage are created; the workspace creation fails with an error. Can someone please guide/suggest what's wrong with the code/l...
Hi @Shrinivas, could you share with us how you configured the Databricks provider?
- 609 Views
- 1 replies
- 0 kudos
Databricks exam got suspended without any reason. Immediate assistance required
Hello Team, @Cert-Team @Cert-Bricks I had my exam yesterday and had a pathetic experience while attempting my first Databricks certification. Abruptly, the proctor asked me to show my desk; after showing it, he/she asked multiple times.. before that somehow 2 ...
@Retired_mod @Cert-Team @Cert-Bricks Thanks so much for responding. But I have been waiting since the day before yesterday and have not yet received any response for the ticket. Can you please look into it? I appreciate it.
- 826 Views
- 0 replies
- 0 kudos
Unexpected response from server during a HTTP connection: authorize: cannot authorize peer.
Hi all, when attempting to connect to Databricks with Spark ODBC using the regular host IP and port, everything is successful. However, we need to send the connection through an internal proxy service that re-maps the server's endpoint to a local port...
- 1518 Views
- 2 replies
- 2 kudos
Resolved! Connecting Power BI to aws databricks using service principal
Hi, I am trying to connect AWS Databricks to Power BI using a service principal. Below are the steps I followed: 1. Created a service principal in Identity and Access. 2. Went to the Permission settings page under Settings > Advanced and added this new service pri...
Thanks for the reply, I got the solution. I was missing adding the SP account in the SQL Warehouse permission settings. After adding it, my Power BI report is working fine.
- 586 Views
- 1 replies
- 1 kudos
How to get the JSON definition - "CREATE part" for a job using JOB ID or JOB Name
I want to get the JSON definition of the "create" part of the job. I have the job ID and job name. I am using a Databricks notebook for this. I can get the "GET" JSON API definition but am not able to get the "CREATE" part JSON definition, which I...
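One possible approach, sketched from a notebook with the REST API: the Jobs 2.1 get response carries a settings object that mirrors the payload accepted by jobs/create, so extracting it gives the "create" JSON. The host, token lookup, and job ID below are placeholders.

```python
import json
import requests

# Placeholders - use your workspace URL, a valid token, and your job ID.
host = "https://<your-workspace>.cloud.databricks.com"
token = dbutils.secrets.get("my-scope", "my-token")  # hypothetical secret scope/key
job_id = 123456789

resp = requests.get(
    f"{host}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": job_id},
)
resp.raise_for_status()

# The "settings" block is essentially the body you would POST to /api/2.1/jobs/create.
create_payload = resp.json()["settings"]
print(json.dumps(create_payload, indent=2))
```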
- 985 Views
- 2 replies
- 0 kudos
Resolved! How can I increase the hard capacity of the master node?
I'm not sure if this is the right place to post my question. If not, please let me know where I should post it. I want to download large files from the web from Databricks' master (driver) node. For example, I fetch a file over 150 GB via API ...
Hi @himanmon, if you are 100% sure that you can't download this file to a storage account configured with Unity Catalog and you want it directly on the driver node's local storage, then why can't you just increase local disk space by choosing a larger instance ty...
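If writing to Unity Catalog storage does turn out to be an option, a rough sketch of streaming the download straight to a volume path (so the driver's local disk never has to hold the full 150 GB); the URL and /Volumes path below are placeholders:

```python
import requests

# Hypothetical source URL and Unity Catalog volume path.
url = "https://example.com/big-file.bin"
target = "/Volumes/my_catalog/my_schema/my_volume/big-file.bin"

with requests.get(url, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    with open(target, "wb") as f:
        # Stream in 16 MB chunks so only a small buffer is held in memory.
        for chunk in resp.iter_content(chunk_size=16 * 1024 * 1024):
            f.write(chunk)
```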
- 1185 Views
- 5 replies
- 0 kudos
Resolved! File found with %fs ls but not with spark.read
Code:
wikipediaDF = (spark.read.option("HEADER", True).option("inferSchema", True).csv("/databricks-datasets/wikipedia-datasets/data-001/pageviews/raw/pageviews_by_second.tsv"))
display(bostonDF)
Error: Failed to store the result. Try rerunning ...
Update 1: Apparently the problem shows up when using display(); using show() or display(df.limit()) works fine. I also started using the premium pricing tier; I'm going to see what happens if I use the free 14-day trial pricing tier. Update 2: I trie...
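For reference, the workaround described above looks roughly like this, assuming the DataFrame is the wikipediaDF from the original snippet:

```python
# show() renders in the driver output and avoids the DBFS result upload that fails:
wikipediaDF.show(20, truncate=False)

# display() on a small, limited result also worked in this case (Databricks notebooks only):
display(wikipediaDF.limit(20))
```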
- 712 Views
- 0 replies
- 0 kudos
Creating a table in ADB SQL for multiple JSON files and selecting all the rows from all the files
Hi, I have multiple JSON files stored in my ADLS Gen2 and I want to create a table which will directly read all the data from ADLS without mounting the files. When I create the table, I cannot select all the data. How can I achieve this? ADLS path: /dwh/...
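A rough sketch of one way to do this without mounting, assuming direct abfss:// access is already configured and using placeholder storage and table names; the reader picks up every JSON file under the folder and the result is saved as a table that can then be queried from SQL:

```python
# Placeholder ADLS Gen2 path - the real folder under /dwh/ goes here.
source_path = "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/dwh/"

df = (
    spark.read
    .option("recursiveFileLookup", "true")  # include JSON files in nested subfolders
    .json(source_path)
)

# Persist as a table so all rows from all files can be selected with SQL.
df.write.mode("overwrite").saveAsTable("my_catalog.my_schema.my_json_table")
```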
- 1528 Views
- 2 replies
- 0 kudos
OAuth user-to-machine (U2M) authentication
I am trying to use OAuth user-to-machine (U2M) authentication from the Azure Databricks CLI. When I run databricks auth login --host , a web browser opens and I get an authentication successful message, and my profile also saves successfully with auth-type...
Hi @Aria, good day! Which CLI version are you using here? Can you try updating the CLI to a newer version by referring to this document: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/install#--homebrew-update-for-linux-...
- 3664 Views
- 2 replies
- 1 kudos
Issue with Private PyPI Mirror Package Dependencies Installation
I'm encountering an issue with the installation of Python packages from a Private PyPI mirror, specifically when the package contains dependencies and the installation is on Databricks clusters - Cluster libraries | Databricks on AWS. Initially, ever...
Hi @hugodscarvalho, I am also at this point, where the transitive dependencies (available in JFrog) are not getting installed on my job cluster. Could you please elaborate a bit on what exactly needed to be changed in the JFrog setup for this to work...
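For context, when a cluster library is installed from a private index, the repo value is used as the pip index URL, which is also where pip tries to resolve transitive dependencies, so that index (for example a JFrog virtual repo that also proxies public PyPI) has to be able to serve them. A rough sketch with the Databricks Python SDK; the exact SDK classes, the JFrog URL, and the package name below are assumptions to adapt to your setup:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library, PythonPyPiLibrary

w = WorkspaceClient()  # picks up the default authentication (notebook or profile)

# Placeholder JFrog PyPI repository URL and package name.
private_index = "https://<your-jfrog-host>/artifactory/api/pypi/<your-repo>/simple"

w.libraries.install(
    cluster_id="<your-cluster-id>",
    libraries=[
        Library(pypi=PythonPyPiLibrary(package="my-internal-package==1.2.3",
                                       repo=private_index)),
    ],
)
```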
- 606 Views
- 1 replies
- 0 kudos
Using model serving from databricks privacy issue
When using Databricks' model serving to query Llama 3, I noticed the endpoint URL is my Databricks instance. Does that still mean data is sent to and processed by Databricks? If so, does Databricks keep/use any of the data sent to model serving ...
Related question: when Databricks processes requests for foundation models, I noticed the latency is pretty small, and I'm wondering what kind of processing power is used on the Databricks side? I am interested in hosting the model ourselves, so I'm wondering what type of...
- 1883 Views
- 1 replies
- 1 kudos
Vector Search index not indexing the whole Delta table
I have a Delta table that I'm trying to index, but when I try to create a vector search index with either the UI or the Python SDK, it only indexes 1 row out of my 3000 rows. I have tried using different vector search endpoints. I have verified the fo...
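For anyone comparing setups, a rough sketch of how a delta sync index is typically created with the Python SDK; the endpoint, table, column, and embedding model names are placeholders, and the source Delta table is assumed to have change data feed enabled:

```python
from databricks.vector_search.client import VectorSearchClient

# The source table is expected to have change data feed enabled, e.g.:
#   ALTER TABLE my_catalog.my_schema.my_source_table
#   SET TBLPROPERTIES (delta.enableChangeDataFeed = true)

client = VectorSearchClient()  # uses notebook authentication by default

client.create_delta_sync_index(
    endpoint_name="my_vs_endpoint",
    index_name="my_catalog.my_schema.my_index",
    source_table_name="my_catalog.my_schema.my_source_table",
    pipeline_type="TRIGGERED",
    primary_key="id",
    embedding_source_column="text",
    embedding_model_endpoint_name="databricks-bge-large-en",
)
```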