
ADE 1.1 - Streaming Design Patterns issue running Classroom-Setup-2.1

AviG
New Contributor II

Error Description:

---------------------------------------------------------------------------
AnalysisException                         Traceback (most recent call last)
File , line 4
      1 DA.paths.checkpoints = f"{DA.paths.working_dir}/_checkpoints"
      3 DA.cleanup()
----> 4 DA.init()
      5 DA.conclude_setup()

File , line 58, in DBAcademyHelper.init(self, create_db)
     56 if create_db:
     57     print(f"\nCreating the database \"{self.db_name}\"")
---> 58     spark.sql(f"CREATE DATABASE IF NOT EXISTS {self.db_name} LOCATION '{self.paths.user_db}'")
     59     spark.sql(f"USE {self.db_name}")
     61 self.initialized = True

File /databricks/spark/python/pyspark/instrumentation_utils.py:48, in _wrap_function.<locals>.wrapper(*args, **kwargs)
     46 start = time.perf_counter()
     47 try:
---> 48     res = func(*args, **kwargs)
     49     logger.log_success(
     50         module_name, class_name, function_name, time.perf_counter() - start, signature
     51     )
     52     return res

File /databricks/spark/python/pyspark/sql/session.py:1602, in SparkSession.sql(self, sqlQuery, args, **kwargs)
   1598 assert self._jvm is not None
   1599 litArgs = self._jvm.PythonUtils.toArray(
   1600     [_to_java_column(lit(v)) for v in (args or [])]
   1601 )
-> 1602 return DataFrame(self._jsparkSession.sql(sqlQuery, litArgs), self)
   1603 finally:
   1604     if len(kwargs) > 0:

File /databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py:1322, in JavaMember.__call__(self, *args)
   1316 command = proto.CALL_COMMAND_NAME +\
   1317     self.command_header +\
   1318     args_command +\
   1319     proto.END_COMMAND_PART
   1321 answer = self.gateway_client.send_command(command)
-> 1322 return_value = get_return_value(
   1323     answer, self.gateway_client, self.target_id, self.name)
   1325 for temp_arg in temp_args:
   1326     if hasattr(temp_arg, "_detach"):

File /databricks/spark/python/pyspark/errors/exceptions/captured.py:194, in capture_sql_exception.<locals>.deco(*a, **kw)
    190 converted = convert_exception(e.java_exception)
    191 if not isinstance(converted, UnknownException):
    192     # Hide where the exception came from that shows a non-Pythonic
    193     # JVM exception message.
--> 194     raise converted from None
    195 else:
    196     raise

AnalysisException: CREATE SCHEMA in Unity Catalog must use MANAGED LOCATION, not LOCATION
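For context, the statement the setup helper runs at line 58 is a plain CREATE DATABASE ... LOCATION, which Unity Catalog rejects. A minimal sketch of the difference, with a placeholder schema name and placeholder paths (not values from the course setup):

# Fails when the session's default catalog is a Unity Catalog catalog:
spark.sql("CREATE DATABASE IF NOT EXISTS my_db LOCATION 'dbfs:/tmp/my_db'")

# Unity Catalog expects MANAGED LOCATION and a governed storage path
# (the abfss path below assumes an external location already exists for it):
spark.sql("CREATE SCHEMA IF NOT EXISTS my_db MANAGED LOCATION 'abfss://container@account.dfs.core.windows.net/my_db'")

# The course setup expects the legacy Hive metastore, where plain LOCATION still works:
spark.sql("CREATE DATABASE IF NOT EXISTS hive_metastore.my_db LOCATION 'dbfs:/tmp/my_db'")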

2 REPLIES

AviG
New Contributor II

I found the solution to this issue. The default catalog for my workspace was somehow pointing to a catalog other than hive_metastore. I changed it back to hive_metastore and it's working now.
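For anyone hitting the same error, a minimal sketch of checking which catalog the session is using and pointing it back at hive_metastore (the workspace-wide default catalog itself is changed by a workspace admin in the admin settings, which is presumably the change described above):

# Show which catalog unqualified database names currently resolve to:
print(spark.sql("SELECT current_catalog()").first()[0])

# Switch the current session back to the legacy Hive metastore:
spark.sql("USE CATALOG hive_metastore")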

kaibarola
New Contributor II

How do you do that?
