When I try to create a DataFrame like this:
lstOfRange = [ ['CREDIT_LIMIT_RANGE', Decimal(10000000.010000), Decimal(100000000000000000000000.000000), '>10,000,000', 'G'] ]
RangeSchema = StructType([StructField("rangeType",St...
Hi @sai_sathya, The issue you’re encountering with the value in the rangeTo column of your DataFrame is related to the precision of floating-point numbers.
Let’s break down what’s happening:
Floating-Point Precision:
Computers represent floating...
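The truncated explanation above comes down to one Python behavior, shown in a minimal, self-contained sketch (the numeric literal is taken from the original post):

```python
from decimal import Decimal

# Decimal(10000000.010000) converts the *float* 10000000.01 first,
# so the Decimal stores that float's closest binary approximation,
# not the literal digits you typed.
from_float = Decimal(10000000.010000)      # inexact: long binary tail
from_string = Decimal("10000000.010000")   # exact: digits preserved

print(from_float == from_string)  # False
```

The fix is to pass the literals as strings, e.g. Decimal("10000000.010000"), and pair them with an explicit DecimalType(precision, scale) in the Spark schema so the values never round-trip through binary floats.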
I am trying to load a dataframe from Databricks to a target Oracle table using the write method and the JDBC API. I have the right drivers. The job and its corresponding stages are getting completed and the data is getting loaded into the Oracle target tab...
Thanks for the response. Can you please elaborate on the Apache Spark JDBC connector? I am using the ojdbc8 driver as per the Databricks documentation. I am not using Delta Lake. I have the data in a dataframe and am using the write method to insert the data to...
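For context on that path: DataFrame.write.jdbc *is* the Apache Spark JDBC connector; the ojdbc8 jar only supplies the driver class it loads. A hedged sketch of a typical Oracle write, where the host, service name, credentials, and table names are all placeholders rather than values from this thread:

```python
def oracle_jdbc_options(host, port, service, user, password):
    """Build the url and properties dict for DataFrame.write.jdbc.

    All names here are illustrative placeholders, not values from
    the original post.
    """
    url = f"jdbc:oracle:thin:@//{host}:{port}/{service}"
    props = {
        "user": user,
        "password": password,
        "driver": "oracle.jdbc.driver.OracleDriver",  # class from ojdbc8
        "batchsize": "10000",  # rows sent per JDBC batch insert
    }
    return url, props

# Usage (df is an existing Spark DataFrame; mode="append" inserts
# without truncating the target table):
# url, props = oracle_jdbc_options("dbhost", 1521, "ORCLPDB1", "app_user", "secret")
# df.write.jdbc(url=url, table="TARGET_SCHEMA.TARGET_TABLE",
#               mode="append", properties=props)
```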
Hi, I am trying to install the "igraph" and "networkD3" CRAN packages for use within a notebook, but am receiving the below error. Could someone please assist? Thanks!
* installing *source* package ‘igraph’ ...
** package ‘igraph’ successfully unpacked...
Based on this igraph github issue https://github.com/igraph/rigraph/issues/490#issuecomment-966890059, I followed the instructions to install glpk. After installing glpk, I was able to install igraph.
We aim to reduce the number of notebooks we create to a minimum and instead make them fairly flexible. Therefore we have a Factory setup that takes in a parameter to vary the logic. However, when it comes to Workflows we are forced to create multipl...
Did you figure out if this was possible? I too find that we have too many workflows and would rather have them combined, but with different parts of the workflow running on different schedules.
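One workaround, assuming the Jobs 2.1 API is used directly: keep the single parameterized notebook and generate one lightweight job per schedule, varying only the cron expression and the parameter. The notebook path, job names, and parameter names below are illustrative placeholders (cluster configuration is omitted for brevity):

```python
def job_payload(schedule_cron, logic_branch):
    """Sketch of a Jobs 2.1 create-job payload that reuses one
    parameterized notebook; only the schedule and parameter vary."""
    return {
        "name": f"factory-job-{logic_branch}",
        "schedule": {
            "quartz_cron_expression": schedule_cron,
            "timezone_id": "UTC",
        },
        "tasks": [{
            "task_key": "run_factory_notebook",
            "notebook_task": {
                "notebook_path": "/Shared/factory_notebook",
                "base_parameters": {"logic_branch": logic_branch},
            },
        }],
    }

# Two schedules, one notebook:
daily = job_payload("0 0 2 * * ?", "daily")
hourly = job_payload("0 0 * * * ?", "hourly")
```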
Hi, I am trying to install the igraph and networkD3 CRAN packages for use within a notebook. However, I am receiving the attached installation error when attempting to do so. Could someone please assist? Thank you!
What can be causing the QB won't open issue, and how can I fix it? I need help immediately to fix this annoying issue! Has anybody else had such problems with QB refusing to open? My personal attempts at troubleshooting have yielded no results. I would be...
@markwilliam8506 If your QB won't open even after multiple tries, you might be facing some common error messages. This scenario can be a result of damaged program files or a faulty installation process, among other possible reasons. The error message...
Hi everyone! I'm encountering an issue while trying to serve my model on a GPU endpoint. My model uses DeepSpeed, which needs to compile CUDA ops, and I got the following error: "An error occurred while loading the model. CUDA_HOME does not exist, unable to compile CUDA op(...
Hi @kfab,
It seems you’re encountering an issue related to CUDA while serving your model on a GPU endpoint.
Let’s troubleshoot this step by step.
CUDA_HOME Not Found: The error message you received, “CUDA_HOME does not exist, unable to compile C...
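DeepSpeed-style op builders resolve the toolkit location roughly like the hypothetical helper below (the search order and paths are common defaults, not taken from the error message). If this returns None inside the serving container, the fix is to use an image that bundles the CUDA toolkit or to set CUDA_HOME explicitly:

```python
import os
import shutil

def locate_cuda_home():
    """Best-effort CUDA toolkit discovery (hypothetical helper).

    Mirrors the usual resolution order: CUDA_HOME env var, then the
    nvcc binary on PATH, then the conventional install prefix.
    """
    cuda_home = os.environ.get("CUDA_HOME")
    if cuda_home:
        return cuda_home
    nvcc = shutil.which("nvcc")
    if nvcc:
        # .../cuda/bin/nvcc -> .../cuda
        return os.path.dirname(os.path.dirname(nvcc))
    if os.path.isdir("/usr/local/cuda"):
        return "/usr/local/cuda"
    return None
```

Note that setting CUDA_HOME alone is not enough if the toolkit is actually absent from the image: the compiler DeepSpeed invokes must exist at that path.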
Hi,I am trying to deploy mlflow model in Sagemaker. My mlflow model is registered in Databrick.Followed below url to deploy and it need ECR for deployment. For ECR, either I can create custom image and push to ECR or its mentioned in below url to get...
Hi @sanjay, Deploying an MLflow model to Amazon SageMaker is a great way to scale your machine learning inference containers. MLflow simplifies the deployment process by providing easy-to-use commands without requiring you to write complex container...
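As a sketch of that flow: the default inference image can be built and pushed to ECR with the MLflow CLI (`mlflow sagemaker build-and-push-container`), after which a deployment is created through the deployments client. Every concrete value below (account, region, role ARN, model URI, and the exact config keys) is a placeholder to be checked against the MLflow SageMaker documentation:

```python
# All values are placeholders, not real resources.
deployment_config = {
    "execution_role_arn": "arn:aws:iam::123456789012:role/sagemaker-access",
    "image_url": "123456789012.dkr.ecr.us-east-1.amazonaws.com/mlflow-pyfunc:2.9.2",
    "region_name": "us-east-1",
    "instance_type": "ml.m5.xlarge",
    "instance_count": 1,
}

# Sketch of the deployment call (needs AWS credentials; untested here):
# from mlflow.deployments import get_deploy_client
# client = get_deploy_client("sagemaker:/us-east-1")
# client.create_deployment(
#     name="my-endpoint",
#     model_uri="models:/my_registered_model/1",
#     config=deployment_config,
# )
```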
Hi, I have a simple Python notebook with the below code:
query = "select table_catalog, table_schema, table_name from system.information_schema.tables where table_type!='VIEW' and table_catalog='TEST' and table_schema='TEST'"
test = spark.sql(query)
disp...
Hi @Sanky, It seems you’re encountering an issue where your Spark job, running as a service principal, doesn’t return any results when querying the same code that works in your workspace.
Let’s troubleshoot this:
Service Principal Permissions:
Y...
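A frequent cause of empty results under a service principal is missing Unity Catalog privileges on the queried catalog and schema. Assuming that is the case here (the TEST names come from the post's query; the application-ID placeholder is hypothetical), the standard grants look like:

```python
# The service principal needs USE CATALOG / USE SCHEMA on the path
# and SELECT on the objects; the placeholder stands for the service
# principal's application (client) ID.
sp = "`<service-principal-application-id>`"
grants = [
    f"GRANT USE CATALOG ON CATALOG TEST TO {sp}",
    f"GRANT USE SCHEMA ON SCHEMA TEST.TEST TO {sp}",
    f"GRANT SELECT ON SCHEMA TEST.TEST TO {sp}",
]

# In a notebook running with sufficient privileges:
# for g in grants:
#     spark.sql(g)
```

Rows in system.information_schema.tables are also filtered by what the caller can see, so the same query legitimately returns nothing until these privileges exist.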
tldr: Notebook-selected "Editor theme (New)" is not being retained after viewing the "push code to repo" screen. I believe I have the answer to this issue. What's occurring and why:
1. User selects: View --> Editor theme --> <<theme>> (ie: Monokai)
2. U...
Hello, currently I have created Databricks tables in the hive_metastore databases. To read these tables using a select * query inside a Databricks notebook, I have to make sure the Databricks cluster is started. The question is to do with reading the databr...
Hello, I am getting the below error while trying to convert my features using VectorAssembler on a Unity Catalog cluster. I tried setting up the config as mentioned in a different post, but it still did not work. Could use some help here. Thank you.
I have a SQL server transactional database on an EC2 instance, and an AWS Glue job that pulls full tables in parquet files into an S3 bucket. There is a very large table that has 44 million rows, and records are added, updated and deleted from this t...
If you have a CDC stream capability, you can use the APPLY CHANGES INTO API to perform SCD1 or SCD2 in a Delta Lake table in Databricks. You can find more information here. This is the best way to go if CDC is a possibility. If you do not have a CD...
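For intuition, the SCD type 1 semantics that APPLY CHANGES INTO implements can be illustrated with a toy in-memory version (not Delta Lake code; a deliberately simplified model):

```python
def apply_changes_scd1(target, changes, key="id"):
    """Toy illustration of SCD type 1 upsert semantics: updates
    overwrite in place, deletes drop the row, no history is kept.

    target:  dict mapping key value -> current row
    changes: ordered list of (operation, row) pairs
    """
    for op, row in changes:
        if op == "DELETE":
            target.pop(row[key], None)
        else:  # INSERT / UPDATE: the latest change wins
            target[row[key]] = row
    return target

state = {1: {"id": 1, "name": "a"}}
state = apply_changes_scd1(state, [
    ("UPDATE", {"id": 1, "name": "a2"}),
    ("INSERT", {"id": 2, "name": "b"}),
    ("DELETE", {"id": 1}),
])
print(state)  # {2: {'id': 2, 'name': 'b'}}
```

SCD2 differs only in that, instead of overwriting or deleting, each change closes the current row (end date) and appends a new versioned one.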
I'm trying to connect to databricks from java using the java sdk and get cluster/sqlWarehouse state. I'm able to connect and get cluster state from my local. But, once I deploy it to the server, my company's network is not allowing the connection. We...
Hi @Nagasundaram
You can use the init script below in order to use a proxy server with a Databricks cluster. The content of the init script can be added at "Workspace/shared/setproxy.sh".
==================================================
v...
How do I add instance profile permissions for all users via the databricks-sdk WorkspaceClient? Just like in Terraform, where we can specify "users" to mean all users, how can we do the same using the databricks-sdk WorkspaceClient? I cannot find permissions for instance pro...
Hi @samarth_solanki,
To manage instance profiles in Databricks and grant access to users, you can follow these steps:
Using the Admin Settings Page:
As a workspace admin, go to the admin settings page.
Click the Instance Profiles tab.
Select the i...
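For the SDK route the original question asked about: to the best of my understanding, Terraform's "users" grant corresponds to adding the instance profile ARN as a SCIM role on the built-in users group. A sketch of that payload with a placeholder ARN (the commented databricks-sdk call is untested and should be checked against the SDK docs):

```python
# Placeholder ARN; the "users" group id can be looked up via the
# SCIM Groups API (filter on displayName).
instance_profile_arn = "arn:aws:iam::123456789012:instance-profile/my-profile"

scim_patch = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [{
        "op": "add",
        "path": "roles",
        "value": [{"value": instance_profile_arn}],
    }],
}

# Sketch with the databricks-sdk (untested; check the SDK docs):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# users_group = next(w.groups.list(filter='displayName eq "users"'))
# w.api_client.do("PATCH",
#                 f"/api/2.0/preview/scim/v2/Groups/{users_group.id}",
#                 body=scim_patch)
```

The profile must first be registered in the workspace (w.instance_profiles.add) before any group can be granted it.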