- 39 Views
- 1 replies
- 0 kudos
dev and prod
"SELECT * FROM' data call on my table in PROD is giving all the rows of data, but a call on my table in DEV is giving me just one row of data. what could be the problem??
- 0 kudos
Tell us more about your environment. Are you using Unity Catalog? What is the table format? What cloud platform are you on? More information is needed.
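While gathering those details, one quick sanity check is to confirm that both sessions actually resolve to the same object. A sketch of the diagnostics (the table name is a placeholder; `DESCRIBE DETAIL` assumes a Delta table on Databricks):

```sql
-- Which catalog/schema does this session resolve unqualified names against?
SELECT current_catalog(), current_schema();

-- Compare row counts in each environment
SELECT COUNT(*) FROM my_table;   -- placeholder table name

-- Storage location, format, and last-modified time (Delta tables)
DESCRIBE DETAIL my_table;
```

If DEV and PROD resolve `my_table` to different catalogs, schemas, or storage locations, that would explain the row-count discrepancy.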
- 49 Views
- 1 replies
- 0 kudos
Search page to search code inside .py files
Hello, hope you are doing well. When on the search page, it seems it's not searching for code inside .py files but rather only the filenames. Is there an option somewhere I'm missing to be able to search inside .py files? Best, Alan
- 0 kudos
Hi @BigAlThePal, as per my understanding, there isn't a built-in option in the Databricks workspace to search inside .py files directly. You could try a few workarounds though, like using the Databricks REST API to list .py files and programmatically s...
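The workaround above can be sketched locally. Assuming you have already fetched each .py file's contents (for example via the Databricks Workspace export API), a simple grep over the exported sources finds matching lines. The helper name and sample paths below are hypothetical:

```python
import re

def grep_py_sources(sources, pattern):
    """Return {path: [matching lines]} for every exported .py file.

    `sources` maps workspace paths to file contents (strings), e.g.
    collected beforehand with the Databricks Workspace export API.
    """
    rx = re.compile(pattern)
    hits = {}
    for path, text in sources.items():
        if not path.endswith(".py"):
            continue  # only search Python files, skip other assets
        matching = [ln for ln in text.splitlines() if rx.search(ln)]
        if matching:
            hits[path] = matching
    return hits

# Hypothetical exported sources:
sources = {
    "/Repos/etl/job.py": "def run():\n    df = spark.read.table('raw')\n",
    "/Repos/etl/notes.txt": "spark.read is mentioned here\n",
}
print(grep_py_sources(sources, r"spark\.read"))
# → {'/Repos/etl/job.py': ["    df = spark.read.table('raw')"]}
```

This keeps the API calls (listing and exporting files) separate from the search itself, so the search logic can be tested without a workspace connection.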
- 21 Views
- 0 replies
- 0 kudos
Issue with disabled "Repair DAG" / "Repair All DAGs" buttons in Airflow UI, though the functionality itself works
We are encountering an issue in the Airflow UI where the 'Repair DAG' and 'Repair All DAGs' options are disabled when a specific task fails. While the repair functionality itself is working properly (i.e., the DAGs can still be repaired through execu...
- 106 Views
- 2 replies
- 1 kudos
Replacing Excel with Databricks
I have a client that currently uses a lot of Excel with VBA and advanced calculations. Their source data is often stored in SQL Server. I am trying to make the case to move to Databricks. What's a good way to make that case? What are some advantages t...
- 66 Views
- 2 replies
- 0 kudos
Will auto loader read files if it doesn't need to?
I want to run Auto Loader on some very large JSON files. I don't actually care about the data inside the files, just the file paths of the blobs. If I do something like

```
spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.fo...
```
- 0 kudos
Hi @charliemerrell, yes, Databricks will still open and parse the JSON files, even if you're only selecting _metadata. It must infer the schema and perform basic parsing unless you explicitly avoid it. So, even if you do `.select("_metadata")`, it doesn't skip...
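One way to avoid the schema-inference pass the reply mentions is to supply an explicit (trivial) schema. A non-runnable sketch, assuming a Databricks cluster (`cloudFiles` is not in open-source Spark); the paths, target table, and the single-column schema are placeholders:

```python
# Sketch only -- requires Databricks; paths and names are placeholders.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .schema("dummy STRING")        # explicit schema: skips the inference pass
      .load("/mnt/landing/big-json/")
      .select("_metadata.file_path", "_metadata.file_size"))

(df.writeStream
   .option("checkpointLocation", "/mnt/checkpoints/file-paths")
   .toTable("file_inventory"))
```

Even with an explicit schema, the files may still be opened and read; whether the JSON bytes are fully parsed when only `_metadata` is projected is worth verifying against your own workload.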
- 106 Views
- 1 replies
- 0 kudos
Trusted assets vs query examples
Hi community! In recent days I explored trusted assets in my Genie space and it is working very well! But I feel a little confused. In my Genie space I have many query examples; when I create a new function with the same query example to verify th...
- 0 kudos
Hello @Dulce42! It depends on your use case. If your function covers the scenario well, you don’t need a separate query example. Having both for the same purpose can create redundancy and make things more complex. Choose the option that best fits you...
- 96 Views
- 1 replies
- 1 kudos
Completed Machine learning course
I have completed the Machine Learning course as part of the Learning Festival.
Join Us as a Local Community Builder!
Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!
Sign Up Now