  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation …
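
    A minimal sketch of that approach, assuming the Databricks SDK for Python is installed and the caller has READ access to the scope (scope and key names are placeholders):

        # Read a secret's raw value outside a notebook via the Databricks SDK.
        # The REST layer returns the value base64-encoded, so decode it.
        import base64
        from databricks.sdk import WorkspaceClient

        w = WorkspaceClient()  # auth picked up from env vars or ~/.databrickscfg
        resp = w.secrets.get_secret(scope="my-scope", key="my-key")
        print(base64.b64decode(resp.value).decode("utf-8"))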

  2. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage …
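
    A short sketch of the distinction, assuming an active Spark session and a placeholder ABFSS path:

        # Managed table: Databricks controls both the metadata and the data files.
        spark.sql("CREATE TABLE main.demo.managed_tbl (id INT)")

        # External table: Databricks manages only the metadata; the files stay at
        # the external LOCATION and survive a DROP TABLE.
        spark.sql("""
            CREATE TABLE main.demo.external_tbl (id INT)
            LOCATION 'abfss://container@account.dfs.core.windows.net/data/external_tbl'
        """)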

  3. how to get databricks job id at the run time - Stack Overflow

    Jun 9, 2025 · I am trying to get the job id and run id of a Databricks job dynamically and store them in a table with the code below
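
    One commonly suggested pattern (parameter and table names here are placeholders): set task parameters to the dynamic value references {{job.id}} and {{job.run_id}} in the job configuration, then read them in the notebook and append them to a table:

        # Task parameters "job_id" and "run_id" are assumed to be mapped to
        # {{job.id}} and {{job.run_id}} in the job definition.
        job_id = dbutils.widgets.get("job_id")
        run_id = dbutils.widgets.get("run_id")

        from pyspark.sql import Row
        (spark.createDataFrame([Row(job_id=job_id, run_id=run_id)])
              .write.mode("append")
              .saveAsTable("audit.job_runs"))  # placeholder audit table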

  4. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done …
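
    For occasional lookups, one option is the Databricks SQL Statement Execution API against a SQL warehouse; a rough sketch with placeholder host, token, warehouse id, and table name:

        import requests

        host = "https://<workspace>.azuredatabricks.net"
        resp = requests.post(
            f"{host}/api/2.0/sql/statements",
            headers={"Authorization": "Bearer <personal-access-token>"},
            json={
                "warehouse_id": "<sql-warehouse-id>",
                "statement": "SELECT * FROM gold.sales_summary LIMIT 10",
                "wait_timeout": "30s",
            },
        )
        resp.raise_for_status()
        print(resp.json()["result"]["data_array"])  # rows as arrays of strings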

  5. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · In Azure Databricks, if you want to create a cluster, you need to have the "Can Manage" permission. This permission basically lets you handle everything related to clusters, …

  6. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath …
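
    A Python-side sketch of the same idea, using the notebook context that dbutils exposes (the attribute chain may vary across runtime versions):

        notebook_path = (
            dbutils.notebook.entry_point.getDbutils()
            .notebook()
            .getContext()
            .notebookPath()
            .get()
        )
        print(notebook_path)  # e.g. /Users/someone@example.com/my_notebook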

  7. Converting SQL stored procedure into a Databricks Notebook: …

    Dec 5, 2023 · I'm trying to convert a SQL stored procedure into a Databricks notebook. One stored procedure has multiple IF statements combined with BEGIN/END statements. Based …
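
    A sketch of the usual translation, with placeholder table names: T-SQL IF ... BEGIN ... END blocks become ordinary Python control flow wrapped around spark.sql calls.

        row_count = spark.sql("SELECT COUNT(*) AS n FROM staging.orders").first()["n"]

        if row_count > 0:
            # Body of the T-SQL BEGIN/END block, expressed as Spark SQL on Delta tables.
            spark.sql("""
                MERGE INTO prod.orders AS t
                USING staging.orders AS s
                ON t.order_id = s.order_id
                WHEN MATCHED THEN UPDATE SET *
                WHEN NOT MATCHED THEN INSERT *
            """)
        else:
            print("No staged rows; skipping merge")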

  8. Can't authenticate deploy of Databricks bundle in Azure pipeline …

    Nov 3, 2023 · Issue: trying to deploy a Databricks bundle within an Azure pipeline. Databricks CLI = v0.209.0. The bundle artifact is downloaded to the VM correctly. Conducted via these instructions: …

  9. How do we connect Databricks with SFTP using Pyspark?

    Aug 17, 2022 · I wish to connect to SFTP (to read files stored in a folder) from a Databricks cluster using PySpark (using a private key). Historically I have been downloading files to a Linux box …
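
    One common approach is to pull the files onto the driver with paramiko and then read them with Spark; a sketch with placeholder host, user, and key path (paramiko may need to be installed on the cluster, e.g. via %pip install paramiko):

        import paramiko

        key = paramiko.RSAKey.from_private_key_file("/dbfs/keys/id_rsa")
        transport = paramiko.Transport(("sftp.example.com", 22))
        transport.connect(username="svc_user", pkey=key)
        sftp = paramiko.SFTPClient.from_transport(transport)

        sftp.get("/remote/folder/data.csv", "/dbfs/tmp/data.csv")  # download to DBFS
        sftp.close()
        transport.close()

        df = spark.read.option("header", True).csv("dbfs:/tmp/data.csv")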

  10. Databricks CREATE VIEW equivalent in PySpark - Stack Overflow

    Jun 24, 2023 · Can someone let me know what the equivalent of the following CREATE VIEW in Databricks SQL is in PySpark? CREATE OR REPLACE VIEW myview as select …
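
    A sketch of two common equivalents (table and view names are placeholders): createOrReplaceTempView for a session-scoped view, or spark.sql passthrough for a persistent one.

        df = spark.table("some_table").select("col_a", "col_b")
        df.createOrReplaceTempView("myview")  # temp view, visible to this Spark session only

        spark.sql("""
            CREATE OR REPLACE VIEW main.demo.myview AS
            SELECT col_a, col_b FROM some_table
        """)  # persistent view registered in the catalog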