by
Bhanu1
• New Contributor III
- 741 Views
- 2 replies
- 0 kudos
The new horizontal view of tasks *****. Can we please have the option for vertical view of a workflow?
Latest Reply
Hi Debayan, this is how workflows used to look before. Tasks are now shown from left to right instead of from top to bottom. It is a pain to scroll through a long workflow now, as mice don't have the ability to scroll left and right.
1 More Replies
- 537 Views
- 1 reply
- 0 kudos
SELECT if((select count(*) from information_schema.table_privileges where grantee = 'samo@test.com' and table_schema='demo_schema' and table_catalog='demo_catalog')==1, (select count(*) from demo_catalog.demo_schema.demo_table), (select count(*) from...
Latest Reply
Hi, GRANT and REVOKE grant or remove privileges on a securable object to or from a principal. A principal is a user, service principal, or group known to the metastore. Principals can be granted privileges and may own securable objects. Also, you can use REVOKE ON S...
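A minimal sketch of how such statements might be issued from a notebook, reusing the catalog/schema/principal names from the question (assumes a workspace with a Unity Catalog metastore and an active `spark` session):

```python
# Names are taken from the question above and are placeholders.
# Assumes a Databricks notebook where `spark` is the active SparkSession.
spark.sql("GRANT SELECT ON SCHEMA demo_catalog.demo_schema TO `samo@test.com`")
spark.sql("REVOKE SELECT ON SCHEMA demo_catalog.demo_schema FROM `samo@test.com`")

# Inspect the current grants on the schema:
spark.sql("SHOW GRANTS ON SCHEMA demo_catalog.demo_schema").show()
```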
- 18210 Views
- 8 replies
- 9 kudos
I am looking for something preferably similar to Windows task manager which we can use for monitoring the CPU, memory and disk usage for local desktop.
Latest Reply
Some important info to look at in the Ganglia UI, in the CPU, memory, and server-load charts, to spot the problem. CPU chart: User %, Idle %. A high user % indicates heavy CPU usage in the cluster. Memory chart: Use %, Free %, Swap %. If you see a purple line ove...
7 More Replies
- 9189 Views
- 6 replies
- 13 kudos
I have a field stored as a string in the format "12/30/2022 10:30:00 AM". If I use the function TO_DATE, I only get the date part... I want the full date and time. If I use the function TO_TIMESTAMP, I get the date and time, but it's assumed to be UTC, ...
Latest Reply
use from_utc_timestamp(to_timestamp("<string>", <format>), <timezone>)
5 More Replies
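The one-liner above can be mirrored in plain Python to sanity-check the expected result. A minimal standard-library sketch, assuming the string represents local time in America/New_York (an illustrative timezone choice, not from the thread):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Parse "12/30/2022 10:30:00 AM" as a naive datetime, then attach the
# intended timezone instead of letting it be treated as UTC (the problem
# described in the question).
raw = "12/30/2022 10:30:00 AM"
naive = datetime.strptime(raw, "%m/%d/%Y %I:%M:%S %p")
local = naive.replace(tzinfo=ZoneInfo("America/New_York"))

print(local.isoformat())  # 2022-12-30T10:30:00-05:00
print(local.astimezone(ZoneInfo("UTC")).isoformat())  # 2022-12-30T15:30:00+00:00
```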
- 2979 Views
- 4 replies
- 6 kudos
Some silly questions, folks. I took the online proctored Databricks Spark certification a couple of days back and my unofficial result was a pass. I received a mail that it might take up to one week to receive the certification, if awar...
Latest Reply
It would have been better to ask for permission before drinking. I can share my experience: my mobile alarm started buzzing during the exam. I notified the moderator, who paused the exam and asked me to take my laptop to the mobile and then to switch it off,...
3 More Replies
by
elgeo
• Valued Contributor II
- 13891 Views
- 5 replies
- 1 kudos
Hello. Is there an equivalent of a SQL stored procedure in Databricks? Please note that I need a procedure that allows DML statements, not only SELECT statements as a function provides. Thank you in advance.
- 9259 Views
- 7 replies
- 13 kudos
The data looks like this:
pageId]|[page]|[Position]|[sysId]|[carId 0005]|[bmw]|[south]|[AD6]|[OP4
There are at least 50 columns and millions of rows.
I did try to use below code to read:
dff = sqlContext.read.format("com.databricks.spark.csv").option...
Latest Reply
You might also try the below options. 1) Use a different file format: you can try using a different file format that supports multi-character delimiters, such as JSON. 2) Use a custom Row class: you can write a custom Row class to parse the multi-...
6 More Replies
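As a rough illustration of option 2 above, a plain-Python sketch that splits the sample rows on the literal `]|[` delimiter (column names taken from the question; this ignores quoting and escaping):

```python
# Sample header and row from the question, delimited by "]|[" with
# stray brackets left at the ends of each line.
header = "pageId]|[page]|[Position]|[sysId]|[carId"
row = "0005]|[bmw]|[south]|[AD6]|[OP4"

def parse_line(line, delim="]|["):
    """Split a line on a multi-character delimiter and trim edge brackets."""
    return [field.strip("[]") for field in line.split(delim)]

columns = parse_line(header)
record = dict(zip(columns, parse_line(row)))
print(record)  # {'pageId': '0005', 'page': 'bmw', ...}
```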
by
Marcel
• New Contributor III
- 20359 Views
- 4 replies
- 2 kudos
Hi Databricks Community, I want to set environment variables for all clusters in my workspace. The goal is to have environment-specific (dev, prod) environment variable values. Instead of setting the environment variables for each cluster, a global script ...
Latest Reply
We have set the env variable in the global init script as below:
sudo echo DATAENV=DEV >> /etc/environment
and we try to access the variable in a notebook that runs in "Shared" cluster mode:
import os
print(os.getenv("DATAENV"))
But the env variable is not a...
3 More Replies
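One caveat on the snippet above: entries appended to /etc/environment are read at login and may not be visible to notebook processes that were already running. A minimal pure-Python sketch of the lookup-with-fallback pattern, using the DATAENV name from the reply:

```python
import os

def data_env(default="dev"):
    """Return DATAENV if the init script exported it, else a default."""
    return os.getenv("DATAENV", default)

os.environ["DATAENV"] = "DEV"  # simulate the init script's export
print(data_env())  # DEV
```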
- 951 Views
- 3 replies
- 3 kudos
Is there a way to remove the "Exit" button from the fullscreen view within the Spark notebook dashboard?
Latest Reply
Could you please share a screenshot of what you see? I don't see any exit button, or I might be looking in the wrong place.
2 More Replies
by
519776
• New Contributor III
- 5044 Views
- 15 replies
- 1 kudos
Hi, I would like to connect our BigQuery env to Databricks, so I created a service account, but where should I configure the service account in Databricks? I read the Databricks documentation and it's not clear at all. Thanks for your help.
Latest Reply
@kfiry, adding to @Werner Stinckens: did you add the projectId in the Spark read query? The projectId should be the one where the BigQuery instance is running. Also, please follow best practices in terms of egress data cost. spark.read.format("bigquery") \ .option("tabl...
14 More Replies
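The truncated read query in the reply might look like the following sketch (project, dataset, and table names are placeholders, not from the thread; assumes the Spark BigQuery connector is available on the cluster):

```python
# Placeholder project/dataset/table names, for illustration only.
# `parentProject` is the project billed for the read; `project` is the
# one hosting the table.
df = (spark.read.format("bigquery")
      .option("parentProject", "my-billing-project")
      .option("project", "my-data-project")
      .option("table", "my_dataset.my_table")
      .load())
df.show()
```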
by
yousry
• New Contributor II
- 1388 Views
- 2 replies
- 2 kudos
To identify whether certain Delta Lake features are available on a certain installation, it is important to have a robust way to identify the Delta Lake version. For OSS, I found that the below Scala snippet will do the job:
import io.delta
println(io.delta.VERSION)
Not...
Latest Reply
@Yousry Mohamed - could you please check the DBR runtime release notes for the Delta Lake API compatibility matrix section (DBR version vs. compatible Delta Lake version) for the mapping. Reference: https://docs.databricks.com/release-notes/runtime/r...
1 More Replies
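On the Python side, a hedged standard-library sketch that reads the installed version of the OSS `delta-spark` pip package (this covers the OSS install path only; for Databricks runtimes the compatibility matrix in the release notes is the authoritative source):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package):
    """Return the installed version string of `package`, or None if absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

print(installed_version("delta-spark"))  # e.g. a version string, or None
```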
- 5231 Views
- 7 replies
- 12 kudos
We have a delta streaming source in our Delta Live Tables pipelines that may have data deleted from time to time. The error message is pretty self-explanatory: ...from streaming source at version 191. This is currently not supported. If you'd like to i...
Latest Reply
I am looking at this as well and would like to understand my options here.
6 More Replies
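Error messages of this shape usually go on to suggest an `ignoreDeletes`/`ignoreChanges` style option on the streaming read. A hedged sketch of where such options are set on a Delta streaming source (the table name is a placeholder; whether this is appropriate inside a DLT pipeline depends on the setup):

```python
# Placeholder table name. ignoreDeletes skips transactions that only
# delete data at partition boundaries; ignoreChanges additionally
# tolerates updates, at the cost of re-emitting rewritten files downstream.
stream = (spark.readStream.format("delta")
          .option("ignoreDeletes", "true")
          .table("demo_catalog.demo_schema.events"))
```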
- 1495 Views
- 3 replies
- 2 kudos
I am thinking of using Delta Live Tables; before that, I want to be aware of the limitations it has as of now, since it was announced at Data + AI Summit 2021.
Latest Reply
There doesn't appear to be a way to enforce a retention policy on source tables when defining a structured stream. Setting the options for "ignoreChanges" and "ignoreDeletes" doesn't seem to have any effect at all. CDC does not fill this role either,...
2 More Replies
by
janwoj
• New Contributor II
- 3727 Views
- 4 replies
- 2 kudos
Hello, I would like to read a Databricks Delta table to show the data on the screen using a PowerApps gallery, and also insert new records into the same table. What is the best method to achieve an efficient connection and perform the above? Cheers
Latest Reply
Anyone find a solution to this yet? I'm currently investigating the same issue. Currently the only one I can find is paying for a third-party tool to set it up. Thanks,
3 More Replies
- 3045 Views
- 2 replies
- 0 kudos
Hi everyone! I am having a problem: I can grant or revoke privileges from users using the UI on Databricks, but when I try to do that using SQL commands such as GRANT SELECT ON SCHEMA [...] TO [USER]; I get an error stating "Operation not allowed". I am...
Latest Reply
It's solved; the problem was that I was using single quotes ('') instead of backticks (``).
1 More Replies