05-24-2022 02:18 AM
dbutils.widgets.text('table', 'product')
%sql
select *
from ds_data.$table
Hello, the above works.
But how can I do something like:
dbutils.widgets.text('table', 'product')
%sql
select *
from ds_data.$table_v3
In that example, $table is still my table name, and I want to append a suffix to it.
I could do it in Python:
dbutils.widgets.text('table', 'product')
tab = dbutils.widgets.get('table')
spark.sql(f'select * from ds_data.{tab}_v3')
but I would love to do it directly in SQL.
Any ideas?
thanks!
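For reference, the Python fallback written out as a full cell would look something like this (ds_data, product and the _v3 suffix are just the names from my example):
# widget with default value 'product'
dbutils.widgets.text('table', 'product')
# current widget value, e.g. 'product'
tab = dbutils.widgets.get('table')
# build the suffixed name in Python and query it; resolves to ds_data.product_v3 here
df = spark.sql(f'select * from ds_data.{tab}_v3')
display(df)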
05-24-2022 04:44 AM
05-24-2022 05:01 AM
Thanks!
Actually, it did not work for me 😞
this:
%sql
select * from prabakar.`$table`_v1
would give:
Error in SQL statement: AnalysisException: Table or view not found: prabakar.table_name;
It seems the "_v1" suffix is being dropped anyway.
05-24-2022 08:31 AM
It looks like you are using the example I shared (`Table or view not found: prabakar.table_name;`). Can you test it with your own sample data?
05-24-2022 08:34 AM
I did test with my own sample data; I just translated the error message to match your example.
05-24-2022 08:36 AM
I checked further and found that the command I used isn't actually helpful. Although it takes the table name, it pulls data from the agents table rather than the agents_v1 table. The agents table has 12 rows and the agents_v1 table has 20 rows, so this doesn't work. We need to build the table name with the suffix programmatically; I don't think it's possible with SQL alone.
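A rough sketch of the programmatic route (reusing the agents / _v1 names from my test; prabakar is my schema and current_table is just a placeholder view name): build the suffixed name in Python, then register a temp view so the rest of the notebook can stay in SQL.
# build the suffixed table name in Python, since widget concatenation does not work inside %sql
dbutils.widgets.text('table', 'agents')
tab = dbutils.widgets.get('table')
# register the result under a fixed name so downstream SQL cells can query it directly
spark.sql(f'select * from prabakar.{tab}_v1').createOrReplaceTempView('current_table')
%sql
select * from current_table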
05-24-2022 08:39 AM
OK, thanks. It's a shame this forces us to move to Python notebooks for something that fit a SQL notebook very well, but that's life 🙂
thanks!
05-24-2022 09:13 AM
🙂
05-24-2022 05:02 AM
Maybe I should add that I'm using DBR 9.1 on a High Concurrency cluster.