Commit acdd41f

Implement PR feedback
1 parent 062b789 commit acdd41f

5 files changed: +17 -125 lines changed

databricks/.env.example

Lines changed: 3 additions & 2 deletions
```diff
@@ -18,5 +18,6 @@ DATABRICKS_TOKEN=your-databricks-token
 DATABRICKS_CATALOG=clickbench_catalog
 DATABRICKS_SCHEMA=clickbench_schema
 
-# Parquet data location
-DATABRICKS_PARQUET_LOCATION=s3://some/path/hits.parquet
+# Parquet data location (must use s3:// format)
+DATABRICKS_PARQUET_LOCATION=s3://clickhouse-public-datasets/hits_compatible/hits.parquet
+
```
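The new comment notes that the location must use the `s3://` scheme. As a minimal sketch of a pre-flight guard a runner script could apply (the guard is an assumption for illustration, not part of this commit; only the variable name comes from the example file):

```bash
# Hypothetical pre-flight check (not in this commit): load .env and
# reject parquet locations that do not use the s3:// scheme.
set -a            # auto-export variables assigned while sourcing
source .env
set +a

if [[ "$DATABRICKS_PARQUET_LOCATION" != s3://* ]]; then
  echo "Error: DATABRICKS_PARQUET_LOCATION must start with s3://" >&2
  exit 1
fi
```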
databricks/NOTES.md

Lines changed: 0 additions & 4 deletions
This file was deleted.

databricks/README.md

Lines changed: 1 addition & 10 deletions
````diff
@@ -1,6 +1,6 @@
 ## Setup
 
-1. Create a Databricks workspace and SQL Warehouse
+1. Create a Databricks workspace and SQL Warehouse (you can do this in the Databricks UI). Once the SQL Warehouse has been created, copy the warehouse path to use in the `.env` file.
 2. Generate a personal access token from your Databricks workspace
 3. Copy `.env.example` to `.env` and fill in your values:
 
@@ -9,15 +9,6 @@ cp .env.example .env
 # Edit .env with your actual credentials
 ```
 
-Required environment variables:
-- `DATABRICKS_SERVER_HOSTNAME`: Your workspace hostname (e.g., `dbc-xxxxxxxx-xxxx.cloud.databricks.com`)
-- `DATABRICKS_HTTP_PATH`: SQL Warehouse path (e.g., `/sql/1.0/warehouses/your-warehouse-id`)
-- `DATABRICKS_TOKEN`: Your personal access token
-- `databricks_instance_type`: Instance type name for results file naming, e.g., "2X-Large"
-- `DATABRICKS_CATALOG`: Unity Catalog name
-- `DATABRICKS_SCHEMA`: Schema name
-- `DATABRICKS_PARQUET_LOCATION`: S3 path to the parquet file
-
 ## Running the Benchmark
 
 ```bash
````
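With the per-variable documentation removed, `.env.example` is now the single reference for configuration. A quick way to sanity-check the hostname and token before running the benchmark is a REST call against the workspace; a hedged sketch, assuming the standard Databricks REST API (not part of this commit):

```bash
# Hypothetical smoke test (not in this commit): list SQL warehouses to
# verify DATABRICKS_SERVER_HOSTNAME and DATABRICKS_TOKEN are usable.
source .env
curl -sf \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  "https://$DATABRICKS_SERVER_HOSTNAME/api/2.0/sql/warehouses" > /dev/null \
  && echo "Credentials look valid" \
  || echo "Could not reach the workspace; check .env values" >&2
```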

databricks/benchmark.sh

Lines changed: 13 additions & 0 deletions
```diff
@@ -1,5 +1,18 @@
 #!/bin/bash
 
+# Ensure uv is installed, using snap on Linux and the official installer on macOS
+if ! command -v uv >/dev/null 2>&1; then
+  if command -v snap >/dev/null 2>&1; then
+    sudo snap install --classic astral-uv
+  elif [[ "$OSTYPE" == "darwin"* ]]; then
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+    export PATH="$HOME/.local/bin:$PATH"
+  else
+    echo "Error: uv is not installed and snap is unavailable. Please install uv manually." >&2
+    exit 1
+  fi
+fi
+
 # Load environment variables
 if [ -f .env ]; then
   set -a
```
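The hunk's context ends at `set -a`, so the remainder of the load block is not visible. As a hedged sketch of the usual shape of this idiom (an assumption about the unchanged part of the script, not something shown in this diff):

```bash
# Assumed continuation of the .env loading idiom; only the lines up to
# `set -a` are visible in the diff context above.
if [ -f .env ]; then
  set -a          # auto-export every variable assigned while sourcing
  source .env
  set +a          # turn auto-export back off
fi
```

`set -a` means plain `KEY=value` lines in `.env` are exported without needing an `export` prefix, so they reach child processes such as the benchmark's Python runner.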

databricks/create.sql

Lines changed: 0 additions & 109 deletions
This file was deleted.
