95 changes: 95 additions & 0 deletions .agents/skills/verify-local-changes/SKILL.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,95 @@
---
name: Verify Local Changes
description: Verifies local Java SDK changes.
---

# Verify Local Changes

This skill documents how to verify local code changes for the Java Firestore SDK. Run it **every time** you complete a fix or feature, before pushing a pull request.

## Prerequisites

Ensure you have Maven installed and are in the `java-firestore` directory before running commands.

---

## Step 0: Format the Code

Run the formatter to ensure formatting checks pass:

```bash
mvn com.spotify.fmt:fmt-maven-plugin:format
```
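
The plugin also provides a `check` goal that fails the build on formatting violations instead of rewriting files, which can be useful to confirm nothing was left unformatted:

```shell
# Verify formatting only (no files are modified); the build fails if any file is unformatted.
mvn com.spotify.fmt:fmt-maven-plugin:check
```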

---

## Step 1: Unit Testing (Isolated then Suite)

1. **Identify modified unit tests** in your changes.
2. **Run the modified tests in isolation** to catch logic regressions quickly:
```bash
mvn test -Dtest=MyUnitTest#testMethod
```
3. **Run the entire test class** containing the modified tests once the isolated runs pass:
```bash
mvn test -Dtest=MyUnitTest
```

---

## Step 2: Integration Testing (Isolated then Suite)

### 💡 Integration Test Nuances (from `ITBaseTest.java`)

When running integration tests, configure your execution using properties or environment variables:

- **`FIRESTORE_EDITION`**:
- `standard` (Default)
- `enterprise`
  - *Note*: **Pipelines can only be run against the `enterprise` edition**, while standard Queries run on both.
- **`FIRESTORE_NAMED_DATABASE`**:
- Enterprise editions usually require a named database (often `enterprise`). Adjust this flag if pointing to specific instances.
- **`FIRESTORE_TARGET_BACKEND`**:
- `PROD` (Default)
- `QA` (points to standard sandboxes)
- `NIGHTLY` (points to `test-firestore.sandbox.googleapis.com:443`)
- `EMULATOR` (points to `localhost:8080`)
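
For example, the variables above can be combined into a single isolated run against the local emulator (a sketch; `ITQueryTest` is a placeholder for whichever test class you modified):

```shell
# Isolated integration run against the local emulator (localhost:8080).
# ITQueryTest is a placeholder; substitute the IT class you changed.
mvn verify -Penable-integration-tests \
  -DFIRESTORE_TARGET_BACKEND=EMULATOR \
  -DFIRESTORE_EDITION=standard \
  -Dtest=ITQueryTest \
  -Dclirr.skip=true -Denforcer.skip=true -fae
```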

1. **Identify modified integration tests** (class names usually start with `IT`).
2. **Run specific integration tests first** (isolated checks run faster):
```bash
mvn verify -Penable-integration-tests -DFIRESTORE_EDITION=enterprise -DFIRESTORE_NAMED_DATABASE=enterprise -Dtest=ITTest#testMethod -Dclirr.skip=true -Denforcer.skip=true -fae
```
3. **Run the entire integration test suite** for the modified class once the isolated tests pass:
```bash
mvn verify -Penable-integration-tests -DFIRESTORE_EDITION=enterprise -DFIRESTORE_NAMED_DATABASE=enterprise -Dtest=ITTest -Dclirr.skip=true -Denforcer.skip=true -fae
```

---

## Step 3: Full Suite Regressions

Run the full integration regression suite once you are confident subsets pass:

```bash
mvn verify -Penable-integration-tests -DFIRESTORE_EDITION=enterprise -DFIRESTORE_NAMED_DATABASE=enterprise -Dclirr.skip=true -Denforcer.skip=true -fae
```

---

> [!TIP]
> Use `-Dclirr.skip=true -Denforcer.skip=true` to speed up iteration where appropriate, without letting compliance-check failures slip through unnoticed.

---

## 🛠️ Troubleshooting & Source of Truth

If you run into issues executing tests with the commands above, **consult the Kokoro configuration files** as the ultimate source of truth:

- **Presubmit configurations**: See `.kokoro/presubmit/integration.cfg` (or `integration-named-db.cfg`)
- **Nightly configurations**: See `.kokoro/nightly/integration.cfg`
- **Build shell scripts**: See `.kokoro/build.sh`

These files define the exact environment variables (e.g., specific endpoints or endpoint overrides) that the CI server uses.
@@ -41,6 +41,7 @@
import com.google.cloud.firestore.pipeline.stages.AddFields;
import com.google.cloud.firestore.pipeline.stages.Aggregate;
import com.google.cloud.firestore.pipeline.stages.AggregateOptions;
import com.google.cloud.firestore.pipeline.stages.Delete;
import com.google.cloud.firestore.pipeline.stages.Distinct;
import com.google.cloud.firestore.pipeline.stages.FindNearest;
import com.google.cloud.firestore.pipeline.stages.FindNearestOptions;
@@ -58,6 +59,7 @@
import com.google.cloud.firestore.pipeline.stages.Union;
import com.google.cloud.firestore.pipeline.stages.Unnest;
import com.google.cloud.firestore.pipeline.stages.UnnestOptions;
import com.google.cloud.firestore.pipeline.stages.Update;
import com.google.cloud.firestore.pipeline.stages.Where;
import com.google.cloud.firestore.telemetry.MetricsUtil.MetricsContext;
import com.google.cloud.firestore.telemetry.TelemetryConstants;
@@ -996,7 +998,119 @@ public Pipeline unnest(Selectable field, UnnestOptions options) {
}

/**
* Adds a generic stage to the pipeline.
* Performs a delete operation on documents from previous stages.
*
* <p>Example:
*
* <pre>{@code
* // Delete all documents in the "logs" collection where "status" is "archived"
* firestore.pipeline()
* .collection("logs")
* .where(field("status").equal("archived"))
* .delete()
* .execute()
* .get();
* }</pre>
*
* @return A new {@code Pipeline} object with this stage appended to the stage list.
*/
@BetaApi
public Pipeline delete() {
return append(new Delete());
}

/**
* Performs an update operation using documents from previous stages.
*
* <p>This method updates the documents in place based on the data flowing through the pipeline.
* To specify transformations, use {@link #update(Selectable...)}.
*
* <p>Example 1: Update a collection's schema by adding a new field and removing an old one.
*
* <pre>{@code
* firestore.pipeline()
* .collection("books")
* .addFields(constant("Fiction").as("genre"))
* .removeFields("old_genre")
* .update()
* .execute()
* .get();
* }</pre>
*
* <p>Example 2: Update documents in place with data from literals.
*
* <pre>{@code
* Map<String, Object> updateData = new HashMap<>();
* updateData.put("__name__", firestore.collection("books").document("book1"));
* updateData.put("status", "Updated");
*
* firestore.pipeline()
* .literals(updateData)
* .update()
* .execute()
* .get();
* }</pre>
*
* @return A new {@code Pipeline} object with this stage appended to the stage list.
*/
@BetaApi
public Pipeline update() {
return append(new Update());
}

/**
* Performs an update operation using documents from previous stages with specified
* transformations.
*
* <p>Example:
*
* <pre>{@code
* // Update the "status" field to "Discounted" for all books where price > 50
* firestore.pipeline()
* .collection("books")
* .where(field("price").greaterThan(50))
* .update(constant("Discounted").as("status"))
* .execute()
* .get();
* }</pre>
*
* @param transformedFields The transformations to apply.
* @return A new {@code Pipeline} object with this stage appended to the stage list.
*/
@BetaApi
public Pipeline update(Selectable... transformedFields) {
return append(new Update().withTransformedFields(transformedFields));
}

/**
* Performs an update operation using an {@link Update} stage.
*
* <p>This method allows you to use a pre-configured {@link Update} stage.
*
* <p>Example:
*
* <pre>{@code
* Update updateStage = new Update().withTransformedFields(constant("Updated").as("status"));
*
* firestore.pipeline()
* .collection("books")
* .where(field("title").equal("The Hitchhiker's Guide to the Galaxy"))
* .update(updateStage)
* .execute()
* .get();
* }</pre>
*
* @param update The {@code Update} stage to append.
* @return A new {@code Pipeline} object with this stage appended to the stage list.
*/
@BetaApi
public Pipeline update(Update update) {
return append(update);
}

/**
* Performs an insert operation using documents from previous stages. Adds a generic stage to the
* pipeline.
*
* <p>This method provides a flexible way to extend the pipeline's functionality by adding custom
* stages. Each generic stage is defined by a unique `name` and a set of `params` that control its
@@ -24,6 +24,7 @@
import com.google.cloud.firestore.pipeline.stages.CollectionOptions;
import com.google.cloud.firestore.pipeline.stages.Database;
import com.google.cloud.firestore.pipeline.stages.Documents;
import com.google.cloud.firestore.pipeline.stages.Literals;
import com.google.common.base.Preconditions;
import java.util.Arrays;
import javax.annotation.Nonnull;
@@ -157,6 +158,32 @@ public Pipeline documents(String... docs) {
.toArray(DocumentReference[]::new)));
}

/**
* Creates a new {@link Pipeline} that operates on a static set of documents represented as Maps.
*
* <p>Example:
*
* <pre>{@code
* Map<String, Object> doc1 = new HashMap<>();
* doc1.put("title", "Book 1");
* Map<String, Object> doc2 = new HashMap<>();
* doc2.put("title", "Book 2");
*
* Snapshot snapshot = firestore.pipeline()
* .literals(doc1, doc2)
* .execute()
* .get();
* }</pre>
*
* @param data The Maps representing documents to include in the pipeline.
* @return A new {@code Pipeline} instance with a literals source.
*/
@Nonnull
@BetaApi
public final Pipeline literals(java.util.Map<String, Object>... data) {
return new Pipeline(this.rpcContext, new Literals(data));
}

/**
* Creates a new {@link Pipeline} from the given {@link Query}. Under the hood, this will
* translate the query semantics (order by document ID, etc.) to an equivalent pipeline.
@@ -0,0 +1,39 @@
/*
* Copyright 2026 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.google.cloud.firestore.pipeline.stages;

import com.google.api.core.BetaApi;
import com.google.api.core.InternalApi;
import com.google.firestore.v1.Value;
import java.util.ArrayList;

@InternalApi
public final class Delete extends Stage {
private Delete(InternalOptions options) {
super("delete", options);
}

@BetaApi
public Delete() {
this(InternalOptions.EMPTY);
}

@Override
Iterable<Value> toStageArgs() {
return new ArrayList<>();
}
}
@@ -0,0 +1,67 @@
/*
* Copyright 2026 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.google.cloud.firestore.pipeline.stages;

import com.google.api.core.BetaApi;
import com.google.api.core.InternalApi;
import com.google.cloud.firestore.PipelineUtils;
import com.google.firestore.v1.Value;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

@InternalApi
public final class Literals extends Stage {

private final List<Map<String, Object>> data;

@BetaApi
public Literals(Map<String, Object>... data) {
super("literals", InternalOptions.EMPTY);
this.data = Arrays.asList(data);
}

@Override
Iterable<Value> toStageArgs() {
List<Value> args = new ArrayList<>();
for (Map<String, Object> map : data) {
args.add(encodeLiteralMap(map));
}
return args;
}

private Value encodeLiteralMap(Map<?, ?> map) {
com.google.firestore.v1.MapValue.Builder mapValue =
com.google.firestore.v1.MapValue.newBuilder();
for (Map.Entry<?, ?> entry : map.entrySet()) {
String key = String.valueOf(entry.getKey());
Object v = entry.getValue();
if (v instanceof com.google.cloud.firestore.pipeline.expressions.Expression) {
mapValue.putFields(
key,
PipelineUtils.encodeValue(
(com.google.cloud.firestore.pipeline.expressions.Expression) v));
} else if (v instanceof Map) {
mapValue.putFields(key, encodeLiteralMap((Map<?, ?>) v));
} else {
mapValue.putFields(key, PipelineUtils.encodeValue(v));
}
}
return Value.newBuilder().setMapValue(mapValue.build()).build();
}
}