Conversation

@ngoiyaeric
Collaborator

@ngoiyaeric ngoiyaeric commented Feb 1, 2026

User description

This change enables users to upload GeoJSON files (.geojson, application/geo+json) directly through the chat interface. Uploaded data is automatically rendered on the map (both Mapbox and Google Maps) and the viewport is adjusted to frame the data's extent using Turf.js.

The implementation includes:

  • A new uploadedGeoJson field in MapDataContext (entry shape sketched after this list).
  • Server-side parsing and validation of GeoJSON in the submit action.
  • A headless MapDataUpdater component that manages state synchronization and "fly to" behavior.
  • Database persistence for GeoJSON layers, ensuring they reappear when a chat is reloaded.
  • Extension of the geospatialTool to allow AI agents to return GeoJSON that is processed through the same pipeline.
  • Improved testability with new data-testid attributes.
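
A minimal sketch of the data model this adds, based on the entry shape described in the review walkthrough (id, filename, data, visible) and the MapDataUpdater props (id, data, filename); illustrative only, not the shipped interfaces:

import type { Feature, FeatureCollection } from 'geojson';

// Sketch: shape of one uploadedGeoJson entry in MapDataContext.
export interface UploadedGeoJsonEntry {
  id: string;              // nanoid assigned at upload time
  filename: string;        // original file name, e.g. "parks.geojson"
  data: FeatureCollection; // normalized to a FeatureCollection
  visible: boolean;        // controls whether the layer is rendered
}

// Sketch: props of the headless MapDataUpdater component.
export interface MapDataUpdaterProps {
  id: string;
  data: FeatureCollection | Feature; // normalized inside the component
  filename: string;
}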

PR created automatically by Jules for task 11124993269366174034 started by @ngoiyaeric


PR Type

Enhancement


Description

  • Enable GeoJSON file uploads (.geojson, application/geo+json) through chat interface

  • Automatically render uploaded GeoJSON on maps with viewport framing using Turf.js

  • Persist GeoJSON layers in database for chat session reload capability

  • Allow AI agents to return GeoJSON through geospatial tool pipeline

  • Add MapDataUpdater component for state synchronization and map navigation


Diagram Walkthrough

flowchart LR
  A["User uploads GeoJSON file"] --> B["Parse and validate GeoJSON"]
  B --> C["MapDataUpdater component"]
  C --> D["Update MapData context"]
  D --> E["Render on Mapbox/Google Maps"]
  E --> F["Fit map bounds with Turf.js"]
  F --> G["Persist to database"]
  H["AI tool returns GeoJSON"] --> I["MapQueryHandler processes result"]
  I --> C
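
The "Parse and validate GeoJSON" step above amounts to detecting the file type, parsing the text, and checking the top-level GeoJSON type. A rough sketch of that logic; isProbablyGeoJsonFile and parseGeoJson are illustrative helpers, not the PR's actual code, which does this inline in app/actions.tsx:

import type { Feature, FeatureCollection } from 'geojson';

// Detect GeoJSON by extension or MIME type (case-insensitive on the name).
function isProbablyGeoJsonFile(file: { name: string; type: string }): boolean {
  return (
    file.name.toLowerCase().endsWith('.geojson') ||
    file.type === 'application/geo+json'
  );
}

// Parse text and accept only a Feature or a well-formed FeatureCollection.
function parseGeoJson(text: string): Feature | FeatureCollection | null {
  try {
    const parsed = JSON.parse(text);
    if (parsed?.type === 'Feature') return parsed as Feature;
    if (parsed?.type === 'FeatureCollection' && Array.isArray(parsed.features)) {
      return parsed as FeatureCollection;
    }
    return null; // valid JSON, but not GeoJSON we can render
  } catch {
    return null; // not valid JSON at all
  }
}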

File Walkthrough

Relevant files

Enhancement (10 files)
  • chat.ts: Add uploadedGeoJson field to context update (+1/-1)
  • actions.tsx: Implement GeoJSON file parsing and upload pipeline (+60/-1)
  • chat-panel.tsx: Add GeoJSON file type to upload input accept (+2/-1)
  • chat.tsx: Sync uploadedGeoJson to server via updateDrawingContext (+6/-5)
  • map-data-context.tsx: Add uploadedGeoJson array to MapData interface (+11/-1)
  • map-data-updater.tsx: Create headless component for GeoJSON context sync (+63/-0)
  • map-query-handler.tsx: Integrate MapDataUpdater for tool GeoJSON results (+13/-1)
  • mapbox-map.tsx: Render uploaded GeoJSON layers on Mapbox (+4/-0)
  • google-map.tsx: Render uploaded GeoJSON layers on Google Maps (+3/-0)
  • geospatial.tsx: Extract and pass GeoJSON from geospatial tool results (+28/-3)

Tests (2 files)
  • message.tsx: Add data-testid attribute to bot message (+1/-1)
  • user-message.tsx: Add data-testid attribute to user message (+1/-1)

Summary by CodeRabbit

  • New Features

    • Upload or paste GeoJSON (files or inline) and have it rendered as map layers.
    • Multiple uploaded GeoJSON items supported with visibility toggles (a toggle sketch follows this summary); map auto-zooms to uploaded data.
    • GeoJSON returned by geospatial tools is automatically added to the map.
  • Bug Fixes

    • Improved detection and error handling for invalid GeoJSON inputs.
  • Tests

    • Added test IDs to file upload and message elements for easier automated testing.
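
The visibility toggles mentioned above come down to flipping the visible flag on one entry in the context. A sketch, assuming the context exposes a functional setMapData updater; the MapData type here is a minimal stand-in, not the real context interface:

import type { FeatureCollection } from 'geojson';

// Minimal stand-in for the MapData context shape (illustrative).
interface MapData {
  uploadedGeoJson?: Array<{
    id: string;
    filename: string;
    data: FeatureCollection;
    visible: boolean;
  }>;
}

// Toggle one uploaded layer's visibility without touching the other entries.
function toggleUploadedLayer(
  setMapData: (update: (prev: MapData) => MapData) => void,
  layerId: string
): void {
  setMapData(prev => ({
    ...prev,
    uploadedGeoJson: (prev.uploadedGeoJson ?? []).map(item =>
      item.id === layerId ? { ...item, visible: !item.visible } : item
    )
  }));
}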


- Extend MapData context to support uploaded GeoJSON layers
- Update ChatPanel to support GeoJSON file selection
- Implement MapDataUpdater component for automatic context sync and map framing
- Update Mapbox and Google Maps components to render uploaded GeoJSON
- Enable AI tools to ingest GeoJSON into the map pipeline via MapQueryHandler
- Ensure persistence of GeoJSON data across chat sessions via database sync
- Add test IDs to key components for improved observability

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
@vercel

vercel bot commented Feb 1, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project: qcx | Deployment: Ready | Actions: Preview, Comment | Updated (UTC): Feb 1, 2026 7:19pm

@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

@charliecreates charliecreates bot requested a review from CharlieHelps February 1, 2026 16:57
@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@coderabbitai
Contributor

coderabbitai bot commented Feb 1, 2026

Walkthrough

Adds end-to-end GeoJSON handling: detect inline or uploaded GeoJSON on submit, emit a geojson_upload AI message, stream a headless MapDataUpdater to inject entries into mapData.uploadedGeoJson, and render those GeoJSON layers in Mapbox/Google Maps; the server-side drawing context persists uploadedGeoJson.

Changes

Cohort / File(s): Summary

  • File submit & AI state (app/actions.tsx, components/chat-panel.tsx): Detect and parse inline GeoJSON and .geojson/application/geo+json uploads, push a geojson_upload AI message, and stream a MapDataUpdater UI message; added data-testid to the file input.
  • Chat sync / drawing context (components/chat.tsx, lib/actions/chat.ts): Include uploadedGeoJson in the drawing-context payload; add snapshot/dedup guards to avoid redundant updates. The updateDrawingContext signature now accepts an optional uploadedGeoJson (sketched after this list).
  • Map data model & updater (components/map/map-data-context.tsx, components/map/map-data-updater.tsx): Add an uploadedGeoJson array to MapData; the new MapDataUpdater component normalizes GeoJSON, appends unique uploaded entries (id, filename, data, visible), and fits map bounds once.
  • Map rendering layers (components/map/google-map.tsx, components/map/mapbox-map.tsx): Render additional GeoJSON layers for each visible item in mapData.uploadedGeoJson (GoogleGeoJsonLayer / GeoJsonLayer).
  • Tooling & parsing (lib/agents/tools/geospatial.tsx, components/map/map-query-handler.tsx): Surface geoJson from MCP/tool responses (added optional geoJson to McpResponse); render MapDataUpdater when tool results include GeoJSON.
  • DB schema & persistence (drizzle/migrations/..., lib/db/schema.ts, lib/actions/chat-db.ts): Add a calendar_notes table; add attachments, tool_name, tool_call_id, and type columns to messages (migration + schema); change the message insert to an upsert to deduplicate on conflict.
  • Types & test attributes (lib/types/index.ts, components/message.tsx, components/user-message.tsx): Add geojson_upload to AIMessage.type; add data-testid attributes to message components for testing.
  • Deps / snapshots (package.json, drizzle/migrations/meta/*): Bump the drizzle-orm version and add/update migration snapshot and journal metadata files.
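
Based on the summary above, the updated server action presumably has a shape along these lines; the payload field names beyond uploadedGeoJson, and the cameraState shape, are assumptions rather than the real signature in lib/actions/chat.ts:

import type { Feature, FeatureCollection } from 'geojson';

// Illustrative drawing-context payload after this PR.
interface DrawingContextPayload {
  drawnFeatures: Feature[];
  cameraState: { center: [number, number]; zoom: number }; // assumed shape
  uploadedGeoJson?: Array<{
    id: string;
    filename: string;
    data: FeatureCollection;
    visible: boolean;
  }>;
}

// Sketch of the action; the real implementation persists the payload to the DB.
export async function updateDrawingContext(
  chatId: string,
  context: DrawingContextPayload
): Promise<void> {
  // ...persist context for chatId...
}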

Sequence Diagram

sequenceDiagram
    actor User
    participant ChatPanel as Chat Panel
    participant Actions as app/actions
    participant AIState as AI State
    participant MapUpdater as MapDataUpdater
    participant MapContext as MapData Context
    participant MapRenderer as Map Components

    User->>ChatPanel: paste or upload GeoJSON
    ChatPanel->>Actions: submit(text/file)
    Actions->>Actions: detect & parse GeoJSON
    Actions->>AIState: push 'geojson_upload' message (id, filename, data)
    Actions->>MapUpdater: stream MapDataUpdater component
    MapUpdater->>MapContext: add uploadedGeoJson entry {id, filename, data, visible:true}
    MapUpdater->>MapUpdater: compute bbox (turf.bbox)
    MapUpdater->>MapRenderer: call fitBounds(bbox)
    MapRenderer->>User: render GeoJsonLayer(s)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

Suggested labels

Backend

Poem

🐇 I found a GeoJSON neat and round,

I nudged its features into town,
Layers popped where bounds took flight,
The map blinked on — what a sight,
Hooray, a rabbit’s tidy ground!

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 42.86%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)
  • Description Check (✅ Passed): Check skipped - CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): The title 'GeoJSON Upload and Tool Ingestion Pipeline' directly and clearly summarizes the main purpose of the changeset, which introduces end-to-end GeoJSON support through uploads, pastes, and AI tool integration.


@qodo-code-review
Contributor

qodo-code-review bot commented Feb 1, 2026

PR Compliance Guide 🔍

Below is a summary of compliance checks for this PR:

Security Compliance
Denial of service

Description: Unbounded GeoJSON ingestion (Buffer -> JSON.parse) on user-uploaded files without
enforcing file size/feature-count limits can be exploited to cause server memory/CPU
exhaustion (e.g., uploading a very large .geojson or deeply nested JSON).
actions.tsx [319-350]

Referred Code
} else if (file.type === 'text/plain' || file.name.endsWith('.geojson') || file.type === 'application/geo+json') {
  const textContent = Buffer.from(buffer).toString('utf-8')
  const isGeoJson = file.name.endsWith('.geojson') || file.type === 'application/geo+json'

  if (isGeoJson) {
    try {
      const geoJson = JSON.parse(textContent)
      if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
        const geoJsonId = nanoid()
        // Add a special message to track the GeoJSON upload
        aiState.update({
          ...aiState.get(),
          messages: [
            ...aiState.get().messages,
            {
              id: geoJsonId,
              role: 'assistant',
              content: JSON.stringify({ data: geoJson, filename: file.name }),
              type: 'geojson_upload'
            }
          ]


 ... (clipped 11 lines)
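
One way to bound this path is to enforce limits before and after parsing; MAX_GEOJSON_BYTES and MAX_FEATURES below are arbitrary example limits, not values from the PR:

// Sketch: reject oversized uploads before JSON.parse and cap feature count after.
const MAX_GEOJSON_BYTES = 5 * 1024 * 1024; // 5 MB, example only
const MAX_FEATURES = 10_000;               // example only

function parseBoundedGeoJson(buffer: ArrayBuffer): unknown {
  if (buffer.byteLength > MAX_GEOJSON_BYTES) return null; // too large, reject before parsing

  let parsed: any;
  try {
    parsed = JSON.parse(Buffer.from(buffer).toString('utf-8'));
  } catch {
    return null; // not valid JSON
  }

  if (parsed?.type === 'FeatureCollection' && Array.isArray(parsed.features)) {
    return parsed.features.length <= MAX_FEATURES ? parsed : null;
  }
  return parsed?.type === 'Feature' ? parsed : null;
}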
Client-side DoS

Description: Rendering and processing untrusted GeoJSON on the client (notably
turf.bbox(featureCollection) and map.fitBounds) without complexity limits can freeze the
UI or crash the tab when given extremely large/complex GeoJSON from uploads or tool
output.
map-data-updater.tsx [19-60]

Referred Code
useEffect(() => {
  if (!data) return;

  // Ensure it's a FeatureCollection for consistency
  const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
    ? data
    : { type: 'FeatureCollection', features: [data] };

  // Update MapData context
  setMapData(prev => {
    // Avoid duplicate entries
    const alreadyExists = prev.uploadedGeoJson?.some(item => item.id === id);
    if (alreadyExists) return prev;

    return {
      ...prev,
      uploadedGeoJson: [
        ...(prev.uploadedGeoJson || []),
        {
          id,
          filename,


 ... (clipped 21 lines)
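
On the client, a cheap guard before calling turf.bbox and fitBounds keeps pathological inputs from locking up the tab; the 5,000-feature threshold is an arbitrary example, and the map parameter is typed structurally to stay library-agnostic:

import * as turf from '@turf/turf';
import type { FeatureCollection } from 'geojson';

const MAX_RENDER_FEATURES = 5_000; // example threshold

function fitBoundsIfReasonable(
  map: { fitBounds: (bbox: [number, number, number, number], opts?: object) => void },
  fc: FeatureCollection
): void {
  // Skip empty or excessively large collections instead of computing their extent.
  if (fc.features.length === 0 || fc.features.length > MAX_RENDER_FEATURES) return;
  try {
    const bbox = turf.bbox(fc) as [number, number, number, number];
    map.fitBounds(bbox, { padding: 50, maxZoom: 15 });
  } catch (e) {
    console.error('Failed to fit bounds for GeoJSON:', e);
  }
}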
Ticket Compliance
🎫 No ticket provided
Codebase Duplication Compliance
Codebase context is not defined

Follow the guide to enable codebase context checks.

Custom Compliance
🟢
Generic: Secure Error Handling

Objective: To prevent the leakage of sensitive system information through error messages while
providing sufficient detail for internal debugging.

Status: Passed

Learn more about managing compliance generic rules or creating your own custom rules

🔴
Generic: Comprehensive Audit Trails

Objective: To create a detailed and reliable record of critical system actions for security analysis
and compliance.

Status:
Missing audit logging: The GeoJSON upload/ingestion path adds and persists user-supplied GeoJSON but does not
emit a structured audit log including user ID, action description, and outcome to
reconstruct who uploaded what and whether it succeeded.

Referred Code
} else if (file.type === 'text/plain' || file.name.endsWith('.geojson') || file.type === 'application/geo+json') {
  const textContent = Buffer.from(buffer).toString('utf-8')
  const isGeoJson = file.name.endsWith('.geojson') || file.type === 'application/geo+json'

  if (isGeoJson) {
    try {
      const geoJson = JSON.parse(textContent)
      if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
        const geoJsonId = nanoid()
        // Add a special message to track the GeoJSON upload
        aiState.update({
          ...aiState.get(),
          messages: [
            ...aiState.get().messages,
            {
              id: geoJsonId,
              role: 'assistant',
              content: JSON.stringify({ data: geoJson, filename: file.name }),
              type: 'geojson_upload'
            }
          ]


 ... (clipped 11 lines)

Learn more about managing compliance generic rules or creating your own custom rules
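
A structured audit entry for the upload path might look like the following; logAudit and the AuditEvent fields are hypothetical, not an existing helper in this repository:

// Hypothetical audit helper; field names are illustrative.
interface AuditEvent {
  userId: string;
  action: 'geojson_upload';
  filename: string;
  sizeBytes: number;
  outcome: 'accepted' | 'rejected' | 'parse_error';
  timestamp: string;
}

function logAudit(event: AuditEvent): void {
  // In practice this would go to a durable, structured log sink.
  console.info(JSON.stringify({ audit: event }));
}

// Usage sketch at the end of the upload branch:
logAudit({
  userId: 'user-123', // would come from the authenticated session in real code
  action: 'geojson_upload',
  filename: 'parks.geojson',
  sizeBytes: 10_240,
  outcome: 'accepted',
  timestamp: new Date().toISOString()
});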

Generic: Robust Error Handling and Edge Case Management

Objective: Ensure comprehensive error handling that provides meaningful context and graceful
degradation

Status:
Silent parse failure: GeoJSON parsing failures are only logged via console.error, and the request continues without surfacing an actionable error to the user or recording a failure outcome, which degrades troubleshooting and the user experience.

Referred Code
if (isGeoJson) {
  try {
    const geoJson = JSON.parse(textContent)
    if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
      const geoJsonId = nanoid()
      // Add a special message to track the GeoJSON upload
      aiState.update({
        ...aiState.get(),
        messages: [
          ...aiState.get().messages,
          {
            id: geoJsonId,
            role: 'assistant',
            content: JSON.stringify({ data: geoJson, filename: file.name }),
            type: 'geojson_upload'
          }
        ]
      })

      // Immediately append the updater to the UI stream
      uiStream.append(


 ... (clipped 6 lines)

Learn more about managing compliance generic rules or creating your own custom rules

Generic: Secure Logging Practices

Objective: To ensure logs are useful for debugging and auditing without exposing sensitive
information like PII, PHI, or cardholder data.

Status:
Logs raw tool data: The tool logs potentially sensitive unredacted content and echoes params via
JSON.stringify(params) which may include user-provided locations/addresses or other
sensitive details in unstructured console logs.

Referred Code
try { content = JSON.parse(content); }
catch { console.warn('[GeospatialTool] Content is not JSON, using as string:', content); }

// Process results
if (typeof content === 'object' && content !== null) {
  const parsedData = content as any;
  if (parsedData.results?.length > 0) {
    const firstResult = parsedData.results[0];
    mcpData = {
      location: {
        latitude: firstResult.coordinates?.latitude,
        longitude: firstResult.coordinates?.longitude,
        place_name: firstResult.name || firstResult.place_name,
        address: firstResult.full_address || firstResult.address
      },
      mapUrl: parsedData.mapUrl,
      geoJson: parsedData.geoJson || parsedData.geojson || firstResult.geoJson || firstResult.geojson
    };
  } else if (parsedData.location) {
    mcpData = {
      location: {


 ... (clipped 20 lines)

Learn more about managing compliance generic rules or creating your own custom rules
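
A small redaction step before logging tool parameters would address this; the sensitive-key list here is an example only, not a vetted policy:

// Example: strip likely-sensitive keys before logging tool params.
const SENSITIVE_KEYS = ['address', 'full_address', 'latitude', 'longitude'];

function redactForLog(params: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(params).map(([key, value]) =>
      SENSITIVE_KEYS.includes(key) ? [key, '[redacted]'] : [key, value]
    )
  );
}

// Usage sketch:
console.warn('[GeospatialTool] request params:', JSON.stringify(redactForLog({ address: '1 Main St', query: 'parks' })));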

Generic: Security-First Input Validation and Data Handling

Objective: Ensure all data inputs are validated, sanitized, and handled securely to prevent
vulnerabilities

Status:
Weak GeoJSON validation: Uploaded GeoJSON is accepted after only a shallow type check (Feature/FeatureCollection)
without schema validation, size/feature-count limits, or sanitization, increasing risk of
malformed/hostile input affecting storage and rendering.

Referred Code
} else if (file.type === 'text/plain' || file.name.endsWith('.geojson') || file.type === 'application/geo+json') {
  const textContent = Buffer.from(buffer).toString('utf-8')
  const isGeoJson = file.name.endsWith('.geojson') || file.type === 'application/geo+json'

  if (isGeoJson) {
    try {
      const geoJson = JSON.parse(textContent)
      if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
        const geoJsonId = nanoid()
        // Add a special message to track the GeoJSON upload
        aiState.update({
          ...aiState.get(),
          messages: [
            ...aiState.get().messages,
            {
              id: geoJsonId,
              role: 'assistant',
              content: JSON.stringify({ data: geoJson, filename: file.name }),
              type: 'geojson_upload'
            }
          ]


 ... (clipped 7 lines)

Learn more about managing compliance generic rules or creating your own custom rules

Generic: Meaningful Naming and Self-Documenting Code

Objective: Ensure all identifiers clearly express their purpose and intent, making code
self-documenting

Status:
Overuse of any: New GeoJSON pipeline types use broad any (e.g., data: any, uploadedGeoJson entries with
data: any) which reduces self-documentation and makes intent and constraints unclear.

Referred Code
interface MapDataUpdaterProps {
  id: string;
  data: any; // FeatureCollection or Feature
  filename: string;
}

Learn more about managing compliance generic rules or creating your own custom rules

Compliance status legend:
🟢 - Fully Compliant
🟡 - Partially Compliant
🔴 - Not Compliant
⚪ - Requires Further Human Verification
🏷️ - Compliance label

@qodo-code-review
Contributor

qodo-code-review bot commented Feb 1, 2026

PR Code Suggestions ✨

Explore these optional code suggestions:

Category: Possible issue
Prevent repeated map view updates
Suggestion Impact: The commit adds a hasZoomedRef guard so map.fitBounds() runs only once, preventing repeated map view updates, though it uses a ref-based approach instead of checking existing GeoJSON in context.

code diff:

-import { useEffect } from 'react';
+import { useEffect, useRef } from 'react';
 import { useMapData } from './map-data-context';
 import { useMap } from './map-context';
 import type { FeatureCollection } from 'geojson';
@@ -15,6 +15,7 @@
 export function MapDataUpdater({ id, data, filename }: MapDataUpdaterProps) {
   const { setMapData } = useMapData();
   const { map } = useMap();
+  const hasZoomedRef = useRef(false);
 
   useEffect(() => {
     if (!data) return;
@@ -45,7 +46,8 @@
     });
 
     // Fly to the extent of the GeoJSON
-    if (map && featureCollection.features.length > 0) {
+    if (map && featureCollection.features.length > 0 && !hasZoomedRef.current) {
+      hasZoomedRef.current = true;
       try {
         const bbox = turf.bbox(featureCollection);
         map.fitBounds(bbox as [number, number, number, number], {

In MapDataUpdater, prevent the useEffect from repeatedly calling map.fitBounds()
by checking if the GeoJSON data has already been processed and exiting early.

components/map/map-data-updater.tsx [15-63]

 export function MapDataUpdater({ id, data, filename }: MapDataUpdaterProps) {
-  const { setMapData } = useMapData();
+  const { mapData, setMapData } = useMapData();
   const { map } = useMap();
 
   useEffect(() => {
     if (!data) return;
+
+    // Prevent re-processing if the data is already in the context.
+    const alreadyExists = mapData.uploadedGeoJson?.some(item => item.id === id);
+    if (alreadyExists) {
+      return;
+    }
 
     // Ensure it's a FeatureCollection for consistency
     const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
       ? data
       : { type: 'FeatureCollection', features: [data] };
 
     // Update MapData context
-    setMapData(prev => {
-      // Avoid duplicate entries
-      const alreadyExists = prev.uploadedGeoJson?.some(item => item.id === id);
-      if (alreadyExists) return prev;
+    setMapData(prev => ({
+      ...prev,
+      uploadedGeoJson: [
+        ...(prev.uploadedGeoJson || []),
+        {
+          id,
+          filename,
+          data: featureCollection,
+          visible: true
+        }
+      ]
+    }));
 
-      return {
-        ...prev,
-        uploadedGeoJson: [
-          ...(prev.uploadedGeoJson || []),
-          {
-            id,
-            filename,
-            data: featureCollection,
-            visible: true
-          }
-        ]
-      };
-    });
-
-    // Fly to the extent of the GeoJSON
+    // Fly to the extent of the GeoJSON only once
     if (map && featureCollection.features.length > 0) {
       try {
         const bbox = turf.bbox(featureCollection);
         map.fitBounds(bbox as [number, number, number, number], {
           padding: 50,
           maxZoom: 15,
           duration: 2000
         });
       } catch (e) {
         console.error('Failed to fit bounds for GeoJSON:', e);
       }
     }
-  }, [id, data, filename, setMapData, map]);
+  }, [id, data, filename, setMapData, map, mapData.uploadedGeoJson]);
 
   return null; // Headless component
 }

[Suggestion processed]

Suggestion importance[1-10]: 8


Why: The suggestion correctly identifies that map.fitBounds() is called on every render, causing a poor user experience, and provides a correct fix to run the effect only once per GeoJSON upload.

Medium
Safely access nested response properties

In lib/agents/tools/geospatial.tsx, use optional chaining (?.) instead of
logical OR (||) to safely access nested geoJson properties from the API
response, preventing potential runtime errors.

lib/agents/tools/geospatial.tsx [393]

-geoJson: parsedData.geoJson || parsedData.geojson || firstResult.geoJson || firstResult.geojson
+geoJson: parsedData.geoJson ?? parsedData.geojson ?? firstResult?.geoJson ?? firstResult?.geojson
Suggestion importance[1-10]: 6


Why: The suggestion correctly proposes using optional chaining (?.) for safer property access, which improves code robustness against potentially missing nested properties in the API response.

Low
Category: General
Provide feedback for invalid uploads

In app/actions.tsx, provide user feedback for invalid GeoJSON uploads by
appending a BotMessage to the uiStream within the catch block and for files that
are not valid GeoJSON features.

app/actions.tsx [323-350]

 if (isGeoJson) {
   try {
     const geoJson = JSON.parse(textContent)
     if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
       const geoJsonId = nanoid()
       // Add a special message to track the GeoJSON upload
       aiState.update({
         ...aiState.get(),
         messages: [
           ...aiState.get().messages,
           {
             id: geoJsonId,
             role: 'assistant',
             content: JSON.stringify({ data: geoJson, filename: file.name }),
             type: 'geojson_upload'
           }
         ]
       })
 
       // Immediately append the updater to the UI stream
       uiStream.append(
         <MapDataUpdater id={geoJsonId} data={geoJson} filename={file.name} />
       )
+    } else {
+      // Handle cases where JSON is valid but not a GeoJSON Feature or FeatureCollection
+      uiStream.append(<BotMessage content="The uploaded file is not a valid GeoJSON Feature or FeatureCollection." />)
     }
   } catch (e) {
     console.error('Failed to parse GeoJSON:', e)
+    uiStream.append(<BotMessage content="The uploaded file could not be parsed as valid GeoJSON." />)
   }
 }

[To ensure code accuracy, apply this suggestion manually]

Suggestion importance[1-10]: 8


Why: The suggestion improves user experience by providing feedback on invalid file uploads, which is a crucial part of error handling for this new feature.

Medium
Use case-insensitive extension check

In app/actions.tsx, make the GeoJSON file detection case-insensitive by
converting file.name to lowercase before checking if it ends with .geojson.

app/actions.tsx [319-321]

-} else if (file.type === 'text/plain' || file.name.endsWith('.geojson') || file.type === 'application/geo+json') {
+} else if (
+  file.type === 'text/plain' ||
+  file.name.toLowerCase().endsWith('.geojson') ||
+  file.type === 'application/geo+json'
+) {
   const textContent = Buffer.from(buffer).toString('utf-8')
-  const isGeoJson = file.name.endsWith('.geojson') || file.type === 'application/geo+json'
+  const isGeoJson =
+    file.name.toLowerCase().endsWith('.geojson') ||
+    file.type === 'application/geo+json'
Suggestion importance[1-10]: 5


Why: The suggestion makes the file type detection more robust by handling case-insensitive file extensions, which is a good practice for file handling.

Low
Filter items before rendering

In GoogleMapComponent, refactor the rendering of uploadedGeoJson to use
.filter(item => item.visible) before .map() to avoid returning boolean values in
the render array.

components/map/google-map.tsx [84-86]

-{mapData.uploadedGeoJson?.map(item => (
-  item.visible && <GoogleGeoJsonLayer key={item.id} data={item.data} />
-))}
+{mapData.uploadedGeoJson
+  ?.filter(item => item.visible)
+  .map(item => (
+    <GoogleGeoJsonLayer key={item.id} data={item.data} />
+  ))}
  • Apply / Chat
Suggestion importance[1-10]: 4

__

Why: The suggestion improves code style and robustness by using .filter() before .map(), which is a clearer and safer pattern for conditional rendering in React than using &&.

Low

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 10

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
lib/actions/chat.ts (1)

211-215: ⚠️ Potential issue | 🟡 Minor

Redis client instantiation may fail silently or cause runtime errors.

The Redis client is instantiated at module load time with potentially empty environment variables. If UPSTASH_REDIS_REST_URL is not configured, this will create a client with an empty URL that will fail on first use. Consider:

  1. Lazy initialization (create client only when needed)
  2. Guard the instantiation with environment variable checks
  3. Or remove if Redis functionality is being deprecated (as suggested by the TODO comments)
🛡️ Suggested fix with lazy initialization
-// `@ts-ignore` - Ignoring Redis import error for now as it might be removed or replaced
-import { Redis } from '@upstash/redis'; // This will cause issues if REDIS_URL is not configured.
-const redis = new Redis({
-  url: process.env.UPSTASH_REDIS_REST_URL?.trim() || '',
-  token: process.env.UPSTASH_REDIS_REST_TOKEN || ''
-});
+import { Redis } from '@upstash/redis';
+
+let _redis: Redis | null = null;
+function getRedis(): Redis {
+  if (!_redis) {
+    const url = process.env.UPSTASH_REDIS_REST_URL?.trim();
+    const token = process.env.UPSTASH_REDIS_REST_TOKEN;
+    if (!url || !token) {
+      throw new Error('Redis configuration missing: UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN are required');
+    }
+    _redis = new Redis({ url, token });
+  }
+  return _redis;
+}

Then update usages:

-    await redis.set(`system_prompt:${userId}`, prompt)
+    await getRedis().set(`system_prompt:${userId}`, prompt)
🤖 Fix all issues with AI agents
In `@app/actions.tsx`:
- Around line 326-340: The current branch that creates a geojson upload message
(around geoJson/type check, nanoid generation and aiState.update that pushes a
message with type 'geojson_upload' and content from file.name) should validate
the GeoJSON more thoroughly before updating aiState: implement or call a
lightweight validator (e.g., a new validateGeoJSON function or an
is-geojson/geojson-validation library) to verify FeatureCollection.features is
an array and each Feature has valid properties (type==='Feature', geometry with
type and coordinates) and that FeatureCollection has valid bbox/crs if present;
if validation fails, avoid pushing the message, log the error (or surface a
user-facing error) and return early. Ensure you reference/replace the simple
geoJson.type check with this validator in the block that generates geoJsonId and
calls aiState.update so only validated GeoJSON gets stored as type
'geojson_upload'.
- Around line 319-350: The GeoJSON parse failure is only console.logged and the
raw file content may still be prepended to messages; modify the catch block
around JSON.parse(textContent) to (1) prevent adding the MapDataUpdater/uiStream
entry or the geojson_upload assistant message when parsing fails and (2) surface
a user-facing error via aiState.update (add an assistant or system message with
id like geoJsonId or similar explaining the file failed to parse) and optionally
append a UI error component instead of MapDataUpdater; reference the existing
symbols JSON.parse, geoJsonId, aiState.update, uiStream.append, and
MapDataUpdater to locate and change behavior.

In `@components/chat.tsx`:
- Around line 92-102: The effect's condition always becomes true once
cameraState exists because arrays are truthy; change the condition in the
useEffect that calls updateDrawingContext so it only runs when id exists,
cameraState exists, and at least one of mapData.drawnFeatures or
mapData.uploadedGeoJson contains items (check .length > 0 or truthy length via
optional chaining), e.g., use mapData.drawnFeatures?.length > 0 ||
mapData.uploadedGeoJson?.length > 0 before calling updateDrawingContext(id,
...); keep the same payload shape and dependency array ([id,
mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]).

In `@components/map/google-map.tsx`:
- Around line 84-86: The map callback currently uses a logical && which can push
false values into the JSX array; update the rendering of mapData.uploadedGeoJson
so you either filter out invisible items first (e.g.,
uploadedGeoJson.filter(item => item.visible).map(item => <GoogleGeoJsonLayer
key={item.id} data={item.data} />)) or use a ternary to return null for
invisible items (item.visible ? <GoogleGeoJsonLayer key={item.id}
data={item.data} /> : null); apply this change where mapData.uploadedGeoJson is
iterated to ensure only valid React nodes (no false values) are returned.

In `@components/map/map-data-context.tsx`:
- Around line 32-37: The uploadedGeoJson entry uses data: any which loses type
safety; update the type in map-data-context.tsx to use GeoJSON's
FeatureCollection (import FeatureCollection from 'geojson' or the project's
GeoJSON types) so uploadedGeoJson?: Array<{ id: string; filename: string; data:
FeatureCollection; visible: boolean; }>; reference the existing
GeoJsonLayerProps in geojson-layer.tsx to ensure the FeatureCollection generic
matches the layer's expected geometry/props types and adjust imports
accordingly.

In `@components/map/map-data-updater.tsx`:
- Around line 9-13: The MapDataUpdaterProps interface uses a loose any for the
data prop; change it to an explicit GeoJSON union (e.g.,
GeoJSON.FeatureCollection | GeoJSON.Feature) by importing the GeoJSON types and
replacing the data: any declaration in MapDataUpdaterProps with the appropriate
union type; also update any usages of the data prop (in the MapDataUpdater
component and callers) to satisfy the stronger type (narrow with type guards
where you expect Feature vs FeatureCollection).
- Around line 22-25: The normalization blindly treats non-FeatureCollection
inputs as Features and can produce invalid GeoJSON; update the logic around the
featureCollection variable to validate the incoming data.type: if data.type ===
'FeatureCollection' keep it, if data.type === 'Feature' use it directly, if
data.type is a GeoJSON Geometry type (e.g., 'Point','Polygon','LineString',
etc.) wrap it in a Feature object ({ type: 'Feature', geometry: data,
properties: {} }), and otherwise handle invalid input by logging an error via
the same logger and returning or falling back to an empty FeatureCollection ({
type: 'FeatureCollection', features: [] }) so downstream consumers only ever
receive a valid FeatureCollection of Feature objects. Ensure this uses the same
variable names (featureCollection, data) so the change is easy to locate.

In `@lib/actions/chat.ts`:
- Around line 165-166: Remove the redundant inner 'use server' directive inside
the updateDrawingContext function; the module already declares 'use server' at
the top, so delete the string literal within the updateDrawingContext function
body (leave the top-level directive intact) to clean up the function declaration
and avoid a no-op directive.

In `@lib/agents/tools/geospatial.tsx`:
- Around line 406-411: When parsedData is a GeoJSON Feature or FeatureCollection
in the geospatial tool, do not set mcpData.location to an empty object because
MapQueryHandler (and its latitude/longitude checks and targetPosition handling)
treats empty objects as invalid; instead either derive a centroid from
parsedData.bbox and populate mcpData.location.latitude and
mcpData.location.longitude, or set mcpData.location to null/undefined to signal
that location should be derived from the GeoJSON later; update the branch
handling parsedData.type === 'FeatureCollection' || 'Feature' to compute a
centroid from parsedData.bbox (or leave location null) and attach geoJson to
mcpData so MapQueryHandler can process it correctly.
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 250283e and 1b56819.

📒 Files selected for processing (12)
  • app/actions.tsx
  • components/chat-panel.tsx
  • components/chat.tsx
  • components/map/google-map.tsx
  • components/map/map-data-context.tsx
  • components/map/map-data-updater.tsx
  • components/map/map-query-handler.tsx
  • components/map/mapbox-map.tsx
  • components/message.tsx
  • components/user-message.tsx
  • lib/actions/chat.ts
  • lib/agents/tools/geospatial.tsx
🧰 Additional context used
🧬 Code graph analysis (10)
lib/actions/chat.ts (2)
lib/actions/chat-db.ts (1)
  • msg (117-121)
components/sidebar/chat-history-client.tsx (1)
  • ChatHistoryClientProps (24-26)
components/user-message.tsx (2)
tests/chat.spec.ts (2)
  • markdownMessage (46-62)
  • test (4-89)
components/header-search-button.tsx (1)
  • currentMessages (51-57)
components/map/map-data-context.tsx (2)
components/map/google-geojson-layer.tsx (4)
  • map (14-57)
  • GoogleGeoJsonLayerProps (6-8)
  • GoogleGeoJsonLayer (10-60)
  • feature (27-44)
components/map/geojson-layer.tsx (2)
  • map (24-80)
  • GeoJsonLayerProps (8-11)
components/chat.tsx (1)
lib/actions/chat.ts (1)
  • updateDrawingContext (165-205)
app/actions.tsx (3)
components/map/map-data-updater.tsx (1)
  • MapDataUpdater (15-63)
components/map/google-geojson-layer.tsx (2)
  • map (14-57)
  • feature (27-44)
components/map/geojson-layer.tsx (3)
  • map (16-97)
  • GeoJsonLayer (13-100)
  • map (24-80)
components/message.tsx (1)
tests/chat.spec.ts (1)
  • markdownMessage (46-62)
components/map/map-query-handler.tsx (1)
components/map/map-data-updater.tsx (1)
  • MapDataUpdater (15-63)
components/map/map-data-updater.tsx (3)
components/map/map-data-context.tsx (1)
  • useMapData (61-67)
components/map/map-context.tsx (1)
  • useMap (24-30)
components/map/geojson-layer.tsx (1)
  • map (16-97)
components/map/google-map.tsx (1)
components/map/google-geojson-layer.tsx (3)
  • GoogleGeoJsonLayer (10-60)
  • feature (27-44)
  • newLayer (52-56)
lib/agents/tools/geospatial.tsx (1)
mapbox_mcp/hooks.ts (3)
  • LocationResult (12-20)
  • DistanceResult (22-29)
  • PlaceResult (31-39)
🔇 Additional comments (10)
components/user-message.tsx (1)

35-35: LGTM!

The data-testid attribute addition aligns with the test selectors in tests/chat.spec.ts and enables proper test automation without affecting component behavior.

components/message.tsx (1)

21-21: LGTM!

The data-testid attribute addition matches the test selectors in tests/chat.spec.ts and supports automated testing.

components/chat-panel.tsx (1)

216-217: LGTM!

The file input changes correctly support GeoJSON uploads:

  • Both the .geojson extension and application/geo+json MIME type are accepted
  • The data-testid attribute enables test automation

GeoJSON files will be shown in the attachment preview (lines 294-305) and sent via FormData for server-side processing.

components/map/map-data-context.tsx (1)

48-52: LGTM!

Initializing the arrays explicitly in the default state prevents potential undefined access errors when components try to map over these arrays.

lib/actions/chat.ts (1)

180-180: Remove the unnecessary cast and update the schema comment to reflect actual supported roles.

The 'data' role is already defined in the AIMessage type and is actively handled in the codebase (see app/actions.tsx:817). The database schema uses varchar(50), which accepts any string. The cast is redundant since 'data' is already part of the union type. The actual issue is that the schema comment at lib/db/schema.ts:30 is outdated—it lists only 'user', 'assistant', 'system', 'tool' but should include 'data' and 'function' to match the actual AIMessage type definition.

components/map/mapbox-map.tsx (1)

573-575: No issues found—GeoJsonLayer is correctly within MapContext provider scope.

The GeoJsonLayer components at lines 573-575 are rendered within the Mapbox component, which successfully uses the useMap() hook at line 23. Since Mapbox calls useMap() without errors, the MapContextProvider (imported in app/layout.tsx line 18) definitively wraps the entire component tree. The GeoJsonLayer instances, being rendered within Mapbox's JSX, are therefore within provider scope and can safely call useMap() to access the map instance. The code is correct.

components/map/map-query-handler.tsx (1)

81-91: LGTM!

The conditional rendering of MapDataUpdater when GeoJSON is available is well-implemented. The use of toolOutput.timestamp as the unique identifier and the fallback filename logic are appropriate.
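
For readers without the diff open, the pattern being described is roughly the following; the exact toolOutput shape, the fallback filename string, and the relative import path are assumptions:

import { MapDataUpdater } from './map-data-updater';
import type { FeatureCollection } from 'geojson';

// Sketch only; the real fields live in map-query-handler.tsx.
interface ToolOutputSketch {
  timestamp: number;
  geoJson?: FeatureCollection;
  placeName?: string;
}

export function ToolGeoJsonBridge({ toolOutput }: { toolOutput: ToolOutputSketch }) {
  if (!toolOutput.geoJson) return null;
  return (
    <MapDataUpdater
      id={String(toolOutput.timestamp)}
      data={toolOutput.geoJson}
      filename={toolOutput.placeName ?? 'tool-result.geojson'}
    />
  );
}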

lib/agents/tools/geospatial.tsx (1)

385-394: LGTM on GeoJSON extraction from results.

The extraction logic correctly handles both geoJson and geojson property name variants at both the response level and individual result level. This defensive approach handles inconsistent API responses well.

app/actions.tsx (2)

751-757: LGTM!

The geojson_upload message type handler correctly reconstructs the MapDataUpdater component from persisted state, enabling GeoJSON layers to reappear when chats are reloaded.


817-836: LGTM on data message handling.

The handling of data messages with uploadedGeoJson arrays correctly iterates and renders MapDataUpdater components for each item. Error handling is appropriate.
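
The reload path these two comments describe is essentially "parse the stored message content, then re-emit a MapDataUpdater per entry"; a sketch under that assumption, where the stored { data, filename } shape follows the upload code quoted earlier and the import path is assumed:

import { MapDataUpdater } from '@/components/map/map-data-updater';

// Sketch: rebuild an uploaded layer from a persisted 'geojson_upload' message.
function renderPersistedGeoJson(message: { id: string; content: string }) {
  try {
    const { data, filename } = JSON.parse(message.content);
    return <MapDataUpdater id={message.id} data={data} filename={filename} />;
  } catch (e) {
    console.error('Could not restore GeoJSON layer from message', message.id, e);
    return null;
  }
}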


Comment on lines +319 to +350
} else if (file.type === 'text/plain' || file.name.endsWith('.geojson') || file.type === 'application/geo+json') {
const textContent = Buffer.from(buffer).toString('utf-8')
const isGeoJson = file.name.endsWith('.geojson') || file.type === 'application/geo+json'

if (isGeoJson) {
try {
const geoJson = JSON.parse(textContent)
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
const geoJsonId = nanoid()
// Add a special message to track the GeoJSON upload
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: file.name }),
type: 'geojson_upload'
}
]
})

// Immediately append the updater to the UI stream
uiStream.append(
<MapDataUpdater id={geoJsonId} data={geoJson} filename={file.name} />
)
}
} catch (e) {
console.error('Failed to parse GeoJSON:', e)
}
}

⚠️ Potential issue | 🟡 Minor

Silent error handling may confuse users when GeoJSON parsing fails.

When JSON.parse(textContent) fails on Line 325, the error is only logged to the console. The file content still gets prepended to the message (lines 352-357), which may lead to confusing AI responses. Consider providing user feedback:

🛡️ Proposed improvement
         } catch (e) {
           console.error('Failed to parse GeoJSON:', e)
+          // Optionally notify the user
+          uiStream.append(
+            <BotMessage content={createStreamableValue(`⚠️ Failed to parse ${file.name} as valid GeoJSON. The file will be processed as plain text.`).value} />
+          )
         }

Comment on lines +326 to +340
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
const geoJsonId = nanoid()
// Add a special message to track the GeoJSON upload
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: file.name }),
type: 'geojson_upload'
}
]
})

🧹 Nitpick | 🔵 Trivial

Consider validating GeoJSON structure more thoroughly.

The current validation only checks geoJson.type. Invalid GeoJSON (e.g., a FeatureCollection with malformed features) could still pass through and cause rendering issues downstream. Consider using a lightweight validation:

🛡️ Optional: Add feature validation
           if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
+            // Basic validation for FeatureCollection
+            if (geoJson.type === 'FeatureCollection' && !Array.isArray(geoJson.features)) {
+              console.warn('Invalid FeatureCollection: missing features array')
+              return
+            }
             const geoJsonId = nanoid()

Comment on lines 92 to 102
   // useEffect to call the server action when drawnFeatures or uploadedGeoJson changes
   useEffect(() => {
-    if (id && mapData.drawnFeatures && mapData.cameraState) {
-      console.log('Chat.tsx: drawnFeatures changed, calling updateDrawingContext', mapData.drawnFeatures);
+    if (id && (mapData.drawnFeatures || mapData.uploadedGeoJson) && mapData.cameraState) {
+      console.log('Chat.tsx: map data changed, calling updateDrawingContext');
       updateDrawingContext(id, {
-        drawnFeatures: mapData.drawnFeatures,
+        drawnFeatures: mapData.drawnFeatures || [],
         cameraState: mapData.cameraState,
+        uploadedGeoJson: mapData.uploadedGeoJson || []
       });
     }
-  }, [id, mapData.drawnFeatures, mapData.cameraState]);
+  }, [id, mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]);

⚠️ Potential issue | 🟠 Major

Condition may trigger excessive database writes.

The condition (mapData.drawnFeatures || mapData.uploadedGeoJson) && mapData.cameraState will always evaluate to true once cameraState is set, because drawnFeatures and uploadedGeoJson are initialized as empty arrays (truthy values).

This means updateDrawingContext will be called on every camera state change (pan, zoom, etc.), even when there are no actual drawn features or uploaded GeoJSON to persist.

🐛 Suggested fix to only save when there's actual data
   // useEffect to call the server action when drawnFeatures or uploadedGeoJson changes
   useEffect(() => {
-    if (id && (mapData.drawnFeatures || mapData.uploadedGeoJson) && mapData.cameraState) {
+    const hasDrawnFeatures = mapData.drawnFeatures && mapData.drawnFeatures.length > 0;
+    const hasUploadedGeoJson = mapData.uploadedGeoJson && mapData.uploadedGeoJson.length > 0;
+    
+    if (id && (hasDrawnFeatures || hasUploadedGeoJson) && mapData.cameraState) {
       console.log('Chat.tsx: map data changed, calling updateDrawingContext');
       updateDrawingContext(id, {
         drawnFeatures: mapData.drawnFeatures || [],
         cameraState: mapData.cameraState,
         uploadedGeoJson: mapData.uploadedGeoJson || []
       });
     }
   }, [id, mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]);

Comment on lines +84 to +86
{mapData.uploadedGeoJson?.map(item => (
item.visible && <GoogleGeoJsonLayer key={item.id} data={item.data} />
))}

🧹 Nitpick | 🔵 Trivial

Conditional rendering inside .map() can return false to the array.

When item.visible is falsy, the expression item.visible && <GoogleGeoJsonLayer ... /> returns false, which React will include in the rendered output (though it won't display anything). For cleaner behavior, filter before mapping or use a ternary with null:

♻️ Suggested refactor
-      {mapData.uploadedGeoJson?.map(item => (
-        item.visible && <GoogleGeoJsonLayer key={item.id} data={item.data} />
-      ))}
+      {mapData.uploadedGeoJson
+        ?.filter(item => item.visible)
+        .map(item => (
+          <GoogleGeoJsonLayer key={item.id} data={item.data} />
+        ))}

Comment on lines +32 to +37
uploadedGeoJson?: Array<{
id: string;
filename: string;
data: any; // FeatureCollection
visible: boolean;
}>;

🧹 Nitpick | 🔵 Trivial

Consider typing data as FeatureCollection for type safety.

The data: any type loses type safety. Based on the GeoJSON layer components (e.g., GeoJsonLayerProps in geojson-layer.tsx), this should be a GeoJSON FeatureCollection.

♻️ Suggested type improvement
+import type { FeatureCollection } from 'geojson';
+
 // ... in MapData interface
   uploadedGeoJson?: Array<{
     id: string;
     filename: string;
-    data: any; // FeatureCollection
+    data: FeatureCollection;
     visible: boolean;
   }>;

Comment on lines +9 to +13
interface MapDataUpdaterProps {
id: string;
data: any; // FeatureCollection or Feature
filename: string;
}

🧹 Nitpick | 🔵 Trivial

Consider stronger typing for the data prop.

Using any here loses type safety. Consider using a union type to be explicit about expected inputs:

+import type { Feature } from 'geojson';
+
 interface MapDataUpdaterProps {
   id: string;
-  data: any; // FeatureCollection or Feature
+  data: FeatureCollection | Feature | null;
   filename: string;
 }

Comment on lines +22 to +25
// Ensure it's a FeatureCollection for consistency
const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
? data
: { type: 'FeatureCollection', features: [data] };

⚠️ Potential issue | 🟡 Minor

Potential silent failure with invalid GeoJSON data.

The normalization logic assumes data is either a FeatureCollection or a valid Feature. If data has an unexpected structure (e.g., data.type is "Point" or "Polygon" — raw geometries rather than Features), wrapping it as { type: 'FeatureCollection', features: [data] } will produce an invalid FeatureCollection that may cause downstream rendering issues.

Consider adding validation:

🛡️ Proposed validation
+    // Validate that data is a Feature or FeatureCollection
+    if (data.type !== 'FeatureCollection' && data.type !== 'Feature') {
+      console.warn('MapDataUpdater: Invalid GeoJSON type, expected Feature or FeatureCollection:', data.type);
+      return;
+    }
+
     // Ensure it's a FeatureCollection for consistency
     const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
       ? data
       : { type: 'FeatureCollection', features: [data] };
🤖 Prompt for AI Agents
In `@components/map/map-data-updater.tsx` around lines 22 - 25, The normalization
blindly treats non-FeatureCollection inputs as Features and can produce invalid
GeoJSON; update the logic around the featureCollection variable to validate the
incoming data.type: if data.type === 'FeatureCollection' keep it, if data.type
=== 'Feature' use it directly, if data.type is a GeoJSON Geometry type (e.g.,
'Point','Polygon','LineString', etc.) wrap it in a Feature object ({ type:
'Feature', geometry: data, properties: {} }), and otherwise handle invalid input
by logging an error via the same logger and returning or falling back to an
empty FeatureCollection ({ type: 'FeatureCollection', features: [] }) so
downstream consumers only ever receive a valid FeatureCollection of Feature
objects. Ensure this uses the same variable names (featureCollection, data) so
the change is easy to locate.
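
For reference, a minimal sketch of the normalization described in the prompt above, covering Feature, FeatureCollection, and bare Geometry inputs; the helper name toFeatureCollection and the empty-collection fallback are illustrative, not code from this PR:

import type { Feature, FeatureCollection, Geometry } from 'geojson';

const GEOMETRY_TYPES = new Set([
  'Point', 'MultiPoint', 'LineString', 'MultiLineString',
  'Polygon', 'MultiPolygon', 'GeometryCollection'
]);

export function toFeatureCollection(data: unknown): FeatureCollection {
  const type = (data as { type?: string } | null)?.type;
  if (type === 'FeatureCollection') return data as FeatureCollection;
  if (type === 'Feature') {
    return { type: 'FeatureCollection', features: [data as Feature] };
  }
  if (type && GEOMETRY_TYPES.has(type)) {
    // Wrap a raw geometry in a Feature so downstream layers always see Features.
    return {
      type: 'FeatureCollection',
      features: [{ type: 'Feature', geometry: data as Geometry, properties: {} }]
    };
  }
  console.error('toFeatureCollection: unsupported GeoJSON input:', type);
  return { type: 'FeatureCollection', features: [] };
}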

Comment on lines 47 to 59
// Fly to the extent of the GeoJSON
if (map && featureCollection.features.length > 0) {
try {
const bbox = turf.bbox(featureCollection);
map.fitBounds(bbox as [number, number, number, number], {
padding: 50,
maxZoom: 15,
duration: 2000
});
} catch (e) {
console.error('Failed to fit bounds for GeoJSON:', e);
}
}

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# Check how useMap is implemented and what map instance it returns
ast-grep --pattern 'export const useMap = $_'

Repository: QueueLab/QCX

Length of output: 489


🏁 Script executed:

# Check for fitBounds usage and context
rg -n 'fitBounds' --type=ts --type=tsx -C3

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Look for Google Maps imports or setup
rg -n 'google\.maps|@react-google-maps|GoogleMap' --type=ts --type=tsx

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Check the context around useMap hook to understand the provider
rg -n 'useMap' --type=ts --type=tsx -B2 -A2 | head -50

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Find MapContextType definition
rg -n 'MapContextType' -A5 | head -30

Repository: QueueLab/QCX

Length of output: 1138


🏁 Script executed:

# Find MapProvider implementation
rg -n 'MapProvider|export.*Provider' components/map/ -A10 | head -50

Repository: QueueLab/QCX

Length of output: 2955


🏁 Script executed:

# Search for fitBounds without type filter
rg -n 'fitBounds' -C3

Repository: QueueLab/QCX

Length of output: 584


🏁 Script executed:

# Search for Google Maps references
rg -n 'google.maps|@react-google-maps|GoogleMap'

Repository: QueueLab/QCX

Length of output: 3033


🏁 Script executed:

# Check package.json to see what mapping libraries are used
cat package.json | grep -A5 -B5 'mapbox\|google\|map'

Repository: QueueLab/QCX

Length of output: 1895


🏁 Script executed:

# Check what map-data-updater.tsx imports and which useMap it uses
head -20 components/map/map-data-updater.tsx

Repository: QueueLab/QCX

Length of output: 584


🏁 Script executed:

# Find the Mapbox component to see if it sets map in context
rg -n 'export.*Mapbox|function Mapbox' components/map/ -A15

Repository: QueueLab/QCX

Length of output: 1615


🏁 Script executed:

# Check GoogleMapComponent implementation
rg -n 'export function GoogleMapComponent' components/map/ -A30

Repository: QueueLab/QCX

Length of output: 1955


🏁 Script executed:

# Check if there's any conditional handling for different map types
rg -n 'MapboxMap|google.maps.Map' components/map/map-data-updater.tsx

Repository: QueueLab/QCX

Length of output: 38


🏁 Script executed:

# Get more of GoogleMapComponent implementation
cat components/map/google-map.tsx | head -80

Repository: QueueLab/QCX

Length of output: 2406


🏁 Script executed:

# Check if google-geojson-layer uses a different approach
cat components/map/google-geojson-layer.tsx

Repository: QueueLab/QCX

Length of output: 1414


🏁 Script executed:

# Verify what providers are actually selectable in settings
rg -n 'mapProvider.*google|mapProvider.*mapbox' components/ -i

Repository: QueueLab/QCX

Length of output: 434


🏁 Script executed:

# Check where MapDataUpdater is imported/used
rg -n 'MapDataUpdater' components/

Repository: QueueLab/QCX

Length of output: 398


🏁 Script executed:

# Check if there are any other places attempting to call fitBounds or similar
rg -n 'fitBounds|flyTo|panTo' components/ -B2 -A2

Repository: QueueLab/QCX

Length of output: 750


🏁 Script executed:

# Verify the actual provider switching logic
cat components/map/map-provider.tsx

Repository: QueueLab/QCX

Length of output: 843


Implement bounds-fitting for Google Maps or restrict MapDataUpdater to Mapbox.

When using the Google Maps provider, the map object in context is null because GoogleMapComponent manages its map instance independently via @vis.gl/react-google-maps. As a result, the fitBounds call (line 51) is silently skipped by the if (map && ...) guard, leaving uploaded GeoJSON uncentered on Google Maps.

Either:

  1. Implement bounds-fitting for Google Maps using its native API (e.g., google.maps.LatLngBounds)
  2. Restrict MapDataUpdater to Mapbox only
  3. Pass the Google Maps instance through context alongside Mapbox

The current code works without errors only because the bounds-fitting is silently skipped for Google Maps.
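
If option 1 is chosen, a minimal sketch of fitting Google Maps to a Turf bbox; it assumes a google.maps.Map instance is made available to the updater (which is not the case today), and the helper name fitGoogleMapToGeoJson is illustrative:

import * as turf from '@turf/turf';
import type { FeatureCollection } from 'geojson';

// Assumes the Google Maps JS API types are loaded (google.maps namespace).
function fitGoogleMapToGeoJson(map: google.maps.Map, fc: FeatureCollection) {
  if (fc.features.length === 0) return;
  // Turf returns [west, south, east, north]
  const [west, south, east, north] = turf.bbox(fc);
  const bounds = new google.maps.LatLngBounds(
    { lat: south, lng: west }, // south-west corner
    { lat: north, lng: east }  // north-east corner
  );
  map.fitBounds(bounds, 50); // 50px padding, mirroring the Mapbox path
}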

Comment on lines +165 to 166
export async function updateDrawingContext(chatId: string, contextData: { drawnFeatures: any[], cameraState: any, uploadedGeoJson?: any[] }) {
'use server';

🧹 Nitpick | 🔵 Trivial

Redundant 'use server' directive.

The 'use server' directive on line 166 is unnecessary since the file already has 'use server' at the top (line 1). This inner directive has no effect and should be removed for clarity.

🧹 Suggested cleanup
 export async function updateDrawingContext(chatId: string, contextData: { drawnFeatures: any[], cameraState: any, uploadedGeoJson?: any[] }) {
-  'use server';
   console.log('[Action] updateDrawingContext called for chatId:', chatId);
🤖 Prompt for AI Agents
In `@lib/actions/chat.ts` around lines 165 - 166, Remove the redundant inner 'use
server' directive inside the updateDrawingContext function; the module already
declares 'use server' at the top, so delete the string literal within the
updateDrawingContext function body (leave the top-level directive intact) to
clean up the function declaration and avoid a no-op directive.

Comment on lines +406 to +411
} else if (parsedData.type === 'FeatureCollection' || parsedData.type === 'Feature') {
// Direct GeoJSON response
mcpData = {
location: {}, // Will be derived from bbox if needed, or left empty
geoJson: parsedData
};

⚠️ Potential issue | 🟡 Minor

Empty location object may cause issues downstream.

When handling direct GeoJSON responses, location is set to an empty object. However, MapQueryHandler (lines 36-59) checks for latitude and longitude in the location and logs warnings when they're invalid. This will trigger the warning path and potentially clear targetPosition.

Consider deriving a centroid from the GeoJSON bbox or leaving location as undefined/null to signal that location data should be derived from the GeoJSON itself:

🛡️ Proposed fix
         } else if (parsedData.type === 'FeatureCollection' || parsedData.type === 'Feature') {
           // Direct GeoJSON response
           mcpData = {
-            location: {}, // Will be derived from bbox if needed, or left empty
+            location: {
+              place_name: 'GeoJSON Result'
+            },
             geoJson: parsedData
           };
🤖 Prompt for AI Agents
In `@lib/agents/tools/geospatial.tsx` around lines 406 - 411, When parsedData is a
GeoJSON Feature or FeatureCollection in the geospatial tool, do not set
mcpData.location to an empty object because MapQueryHandler (and its
latitude/longitude checks and targetPosition handling) treats empty objects as
invalid; instead either derive a centroid from parsedData.bbox and populate
mcpData.location.latitude and mcpData.location.longitude, or set
mcpData.location to null/undefined to signal that location should be derived
from the GeoJSON later; update the branch handling parsedData.type ===
'FeatureCollection' || 'Feature' to compute a centroid from parsedData.bbox (or
leave location null) and attach geoJson to mcpData so MapQueryHandler can
process it correctly.
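
A minimal sketch of the bbox-centroid option, assuming Turf is available in the tool the same way it is in MapDataUpdater; the helper name deriveLocationFromGeoJson is illustrative:

import * as turf from '@turf/turf';
import type { Feature, FeatureCollection } from 'geojson';

function deriveLocationFromGeoJson(geoJson: Feature | FeatureCollection) {
  // Turf returns [west, south, east, north]; take the midpoint as a rough center.
  const [minX, minY, maxX, maxY] = turf.bbox(geoJson);
  return {
    longitude: (minX + maxX) / 2,
    latitude: (minY + maxY) / 2,
    place_name: 'GeoJSON Result'
  };
}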

@charliecreates charliecreates bot left a comment

GeoJSON uploads currently bloat LLM prompts because the full file contents are still appended as a text message in submit, even when successfully ingested into the map pipeline. UI hydration has a correctness risk: geojson_upload is parsed without a try/catch, and replaying uploadedGeoJson via MapDataUpdater can cause repeated fitBounds camera jumps (especially on chat reload). Tool ingestion uses timestamp as an id, which is not reliably unique/stable and can lead to unintended de-duping. Several places rely on any for GeoJSON/state payloads, making the pipeline easier to break silently.

Additional notes (1)
  • Performance | app/actions.tsx:316-354

GeoJSON uploads are still appended into the user prompt

In submit, GeoJSON files are treated as text/plain and their full contents are appended into messageParts as a text message even when they’re successfully parsed and handled via geojson_upload + MapDataUpdater.

That has several real downsides:

  • Token bloat / cost: large GeoJSONs will inflate the prompt dramatically.
  • Leaky / noisy context: the model sees raw coordinates and properties that it doesn’t need to answer many questions.
  • Duplicated behavior: you already persist/render GeoJSON via map state; repeating it in chat input is redundant.

If the intended UX is “upload GeoJSON to the map”, the text payload should typically not be sent to the LLM unless the user explicitly wants it.

Summary of changes

What changed

GeoJSON ingestion pipeline

  • Added support for uploading and ingesting GeoJSON (.geojson, application/geo+json) through the chat file upload flow.
  • Introduced a new headless client component MapDataUpdater to:
    • normalize input to a FeatureCollection
    • add items to MapDataContext.uploadedGeoJson
    • auto-fitBounds using Turf.js

Map state + rendering

  • Extended MapData with uploadedGeoJson (id/filename/data/visible) and initialized it in MapDataProvider.
  • Render uploaded GeoJSON layers in both:
    • components/map/mapbox-map.tsx via <GeoJsonLayer />
    • components/map/google-map.tsx via <GoogleGeoJsonLayer />

Tool integration

  • Extended geospatialTool parsing to pass through geoJson from tool output.
  • Updated MapQueryHandler to render a MapDataUpdater when tool output includes GeoJSON.

Persistence / UI reconstruction

  • Updated getUIStateFromAIState to:
    • handle a new geojson_upload assistant message type
    • handle data messages that include uploadedGeoJson and replay via MapDataUpdater

Testability

  • Added data-testid attributes to file input, bot messages, and user messages.

Comment on lines +327 to +346
const geoJsonId = nanoid()
// Add a special message to track the GeoJSON upload
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: file.name }),
type: 'geojson_upload'
}
]
})

// Immediately append the updater to the UI stream
uiStream.append(
<MapDataUpdater id={geoJsonId} data={geoJson} filename={file.name} />
)
}

Server action appends a React client component into the UI stream

submit (a server action) is directly appending <MapDataUpdater /> into uiStream. That means a server-side path is deciding to render a client component as a side effect of file parsing.

This is fragile for a few reasons:

  • It tightly couples server parsing with a specific client synchronization mechanism.
  • It risks odd replay behavior (e.g., streaming replays, retries) causing multiple MapDataUpdater mounts.
  • It makes the UI update dependent on server streaming rather than the persisted AI state reconstruction (which you already added via geojson_upload in getUIStateFromAIState).

Given you already insert a geojson_upload message into aiState, UI reconstruction can naturally render the updater from state without also pushing it immediately.

Suggestion

Remove the immediate uiStream.append(<MapDataUpdater ... />) and rely on the geojson_upload message being rendered via getUIStateFromAIState.

If you need instant feedback before the assistant message is produced, consider appending a lightweight server-renderable placeholder (e.g., “Uploaded X”) and let the client handle map updates based on AI state.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit making this change.

Comment on lines +751 to +757
case 'geojson_upload': {
const { data, filename } = JSON.parse(content as string)
return {
id,
component: <MapDataUpdater id={id} data={data} filename={filename} />
}
}

Unvalidated JSON.parse for geojson_upload can crash UI rendering

getUIStateFromAIState parses content for geojson_upload without a try/catch. Any malformed content (db corruption, older message format, partial writes) will throw and can break UI state mapping.

You already wrap parsing for other message kinds; this one should be consistent.

Suggestion

Wrap the JSON.parse in a try/catch (similar to the tool and data cases) and return { id, component: null } on failure.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this defensive parsing.

Comment on lines +817 to +836
case 'data':
try {
const contextData = JSON.parse(content as string)
if (contextData.uploadedGeoJson && Array.isArray(contextData.uploadedGeoJson)) {
return {
id,
component: (
<>
{contextData.uploadedGeoJson.map((item: any) => (
<MapDataUpdater key={item.id} id={item.id} data={item.data} filename={item.filename} />
))}
</>
)
}
}
return { id, component: null }
} catch (e) {
console.error('Error parsing data message:', e)
return { id, component: null }
}

data message rendering uses any and replays side effects without guarding

In the data case you iterate uploadedGeoJson.map((item: any) => <MapDataUpdater ... />). Two concerns:

  1. Using any here makes it easy to introduce unsafe shapes that still compile. This is UI-state hydration code; it should be strict.
  2. Rendering many MapDataUpdaters will attempt fitBounds for each layer; on chat reload this can cause multiple sequential camera jumps. The current MapDataUpdater only de-dupes by id for inserting into context, but it does not de-dupe the fitBounds side effect. So reload may result in the map fitting to the last item every time, even if the user previously set a different view.
Suggestion
  1. Replace any with a minimal structural type (e.g., { id: string; data: unknown; filename: string }) and validate required fields before rendering.

  2. Add a flag to MapDataUpdater like shouldFlyTo?: boolean and set it to false when replaying persisted uploadedGeoJson from a data message, or add internal logic to only fitBounds on first-ever insert (e.g., only when it was newly added to context).

Reply with "@CharlieHelps yes please" if you'd like me to add a commit implementing shouldFlyTo and tightening the hydration type.
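
A rough sketch of what the tightened hydration could look like; the type and guard names are illustrative, and shouldFlyTo is the prop proposed above, not something that exists yet:

import type { FeatureCollection } from 'geojson';

interface UploadedGeoJsonItem {
  id: string;
  filename: string;
  data: FeatureCollection;
}

function isValidUpload(item: unknown): item is UploadedGeoJsonItem {
  if (typeof item !== 'object' || item === null) return false;
  const candidate = item as Partial<UploadedGeoJsonItem>;
  return (
    typeof candidate.id === 'string' &&
    typeof candidate.filename === 'string' &&
    typeof candidate.data === 'object' &&
    candidate.data !== null &&
    (candidate.data as { type?: string }).type === 'FeatureCollection'
  );
}

// During hydration, validate first and replay without camera movement:
// contextData.uploadedGeoJson
//   .filter(isValidUpload)
//   .map(item => (
//     <MapDataUpdater key={item.id} id={item.id} data={item.data} filename={item.filename} shouldFlyTo={false} />
//   ))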

Comment on lines +81 to +91
return (
<>
{toolOutput?.mcp_response?.geoJson && (
<MapDataUpdater
id={toolOutput.timestamp}
data={toolOutput.mcp_response.geoJson}
filename={toolOutput.mcp_response.location?.place_name || 'Tool Result'}
/>
)}
</>
);

Tool-derived GeoJSON id is not stable/unique enough

MapQueryHandler uses toolOutput.timestamp as the MapDataUpdater id. If two tool calls land within the same timestamp granularity (or if timestamps are reused across reloads), this can:

  • cause unintended de-dupe (layer never appears)
  • overwrite expectations when persisting/replaying

Also, timestamps aren’t semantically tied to the GeoJSON payload.

Suggestion

Generate a unique id for each tool GeoJSON emission (e.g., nanoid() on the client, or use a deterministic hash of the GeoJSON if you want stable ids), and keep timestamp as metadata.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit updating the id generation strategy.
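
A sketch of the deterministic-hash variant; the rolling hash below is illustrative rather than an existing utility in this repo, and nanoid is the random fallback mentioned above:

import { nanoid } from 'nanoid';

// Derive a stable layer id from the GeoJSON payload itself.
function geoJsonLayerId(geoJson: unknown): string {
  try {
    const text = JSON.stringify(geoJson);
    let hash = 0;
    for (let i = 0; i < text.length; i++) {
      // Simple 32-bit rolling hash; good enough to keep layer ids distinct.
      hash = (hash * 31 + text.charCodeAt(i)) | 0;
    }
    return `tool-geojson-${(hash >>> 0).toString(16)}`;
  } catch {
    // Unserializable payload: fall back to a random id.
    return `tool-geojson-${nanoid()}`;
  }
}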

Comment on lines 19 to 60
useEffect(() => {
if (!data) return;

// Ensure it's a FeatureCollection for consistency
const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
? data
: { type: 'FeatureCollection', features: [data] };

// Update MapData context
setMapData(prev => {
// Avoid duplicate entries
const alreadyExists = prev.uploadedGeoJson?.some(item => item.id === id);
if (alreadyExists) return prev;

return {
...prev,
uploadedGeoJson: [
...(prev.uploadedGeoJson || []),
{
id,
filename,
data: featureCollection,
visible: true
}
]
};
});

// Fly to the extent of the GeoJSON
if (map && featureCollection.features.length > 0) {
try {
const bbox = turf.bbox(featureCollection);
map.fitBounds(bbox as [number, number, number, number], {
padding: 50,
maxZoom: 15,
duration: 2000
});
} catch (e) {
console.error('Failed to fit bounds for GeoJSON:', e);
}
}
}, [id, data, filename, setMapData, map]);

MapDataUpdater triggers camera movement on every mount; this will be disruptive

MapDataUpdater always calls fitBounds whenever it mounts with data. This is likely fine for a single upload, but it becomes disruptive when:

  • multiple layers are replayed (e.g., from persisted state) → multiple consecutive camera jumps
  • the component remounts due to UI state reconciliation

Because this is a headless “sync” component, controlling side effects is crucial.

Suggestion

Add an explicit prop (e.g., flyTo?: boolean defaulting to true) and/or only run fitBounds if the layer was newly inserted into uploadedGeoJson (i.e., inside the setMapData updater when alreadyExists === false).

That way, replay/hydration can set flyTo={false} and avoid surprising camera behavior.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this change.
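
One way the opt-out could look, sketched as a small helper the component would call after inserting into context; the flyTo parameter and helper name are illustrative, and the Mapbox-only typing is an assumption:

import * as turf from '@turf/turf';
import type { FeatureCollection } from 'geojson';
import type { Map as MapboxMap } from 'mapbox-gl';

// Camera move extracted into a helper so replay/hydration can pass flyTo=false.
function maybeFitBounds(map: MapboxMap | null, fc: FeatureCollection, flyTo = true) {
  if (!flyTo || !map || fc.features.length === 0) return;
  try {
    const bbox = turf.bbox(fc);
    map.fitBounds(bbox as [number, number, number, number], {
      padding: 50,
      maxZoom: 15,
      duration: 2000
    });
  } catch (e) {
    console.error('Failed to fit bounds for GeoJSON:', e);
  }
}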

Comment on lines 47 to 55
// Fly to the extent of the GeoJSON
if (map && featureCollection.features.length > 0) {
try {
const bbox = turf.bbox(featureCollection);
map.fitBounds(bbox as [number, number, number, number], {
padding: 50,
maxZoom: 15,
duration: 2000
});

MapDataUpdater uses map.fitBounds(bbox as [number, number, number, number], ...) with a Turf bbox. Two concerns:

  1. fitBounds APIs differ across providers. If useMap() can point at multiple map implementations, this can become a runtime hazard.
  2. fitBounds generally expects a LngLatBoundsLike (often [[west, south],[east, north]]), while Turf returns [minX, minY, maxX, maxY]. Some libs accept the 4-tuple; others do not.

Right now you’re relying on a cast to force it through.

Suggestion

Normalize the bounds shape and isolate provider differences:

  • Convert Turf bbox to a canonical bounds object/tuple your useMap() abstraction guarantees.
  • If useMap() is Mapbox-only, pass [[minX, minY],[maxX, maxY]] to be explicit.

Example:

const [minX, minY, maxX, maxY] = turf.bbox(featureCollection)
const bounds: [[number, number],[number, number]] = [[minX, minY], [maxX, maxY]]
map.fitBounds(bounds, { padding: 50, maxZoom: 15, duration: 2000 })

Reply with "@CharlieHelps yes please" if you’d like me to add a commit with this change (and optionally a small adapter in useMap).

Comment on lines 92 to 102
 // useEffect to call the server action when drawnFeatures or uploadedGeoJson changes
 useEffect(() => {
-  if (id && mapData.drawnFeatures && mapData.cameraState) {
-    console.log('Chat.tsx: drawnFeatures changed, calling updateDrawingContext', mapData.drawnFeatures);
+  if (id && (mapData.drawnFeatures || mapData.uploadedGeoJson) && mapData.cameraState) {
+    console.log('Chat.tsx: map data changed, calling updateDrawingContext');
     updateDrawingContext(id, {
-      drawnFeatures: mapData.drawnFeatures,
+      drawnFeatures: mapData.drawnFeatures || [],
       cameraState: mapData.cameraState,
+      uploadedGeoJson: mapData.uploadedGeoJson || []
     });
   }
-}, [id, mapData.drawnFeatures, mapData.cameraState]);
+}, [id, mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]);

This effect logs every map-data change and triggers a server action whenever uploadedGeoJson changes. Because uploadedGeoJson elements contain full feature collections, any update that changes object identity (even if semantically the same) can cause repeated writes.

Also, sending full GeoJSON on every change is likely heavy. If MapDataUpdater adds one layer, this effect will upload the entire uploadedGeoJson array back to the server each time.

Suggestion

Reduce redundant writes and payload size:

  • Remove the console.log (or gate it behind a debug flag).
  • Persist only deltas (the newly added layer) instead of the whole array, or persist only IDs + references if you already store the GeoJSON elsewhere.
  • If you keep this approach, debounce the updateDrawingContext call (e.g., 300–500ms) and/or compute a stable hash of uploadedGeoJson to avoid writes on identity-only changes.

Reply with "@CharlieHelps yes please" if you’d like me to add a commit implementing a debounce + change detection.
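
A rough sketch of the debounce-plus-change-detection idea as a hook; the hook name, the 400 ms window, and the import path are assumptions, and JSON.stringify stands in for a real hash:

import { useEffect, useRef } from 'react';
import { updateDrawingContext } from '@/lib/actions/chat'; // path assumed

type DrawingContextPayload = {
  drawnFeatures: any[];
  cameraState: any;
  uploadedGeoJson?: any[];
};

function useDebouncedDrawingSync(id: string | undefined, payload: DrawingContextPayload) {
  const lastSentRef = useRef<string>('');

  useEffect(() => {
    if (!id) return;
    const serialized = JSON.stringify(payload); // cheap structural change detection
    if (serialized === lastSentRef.current) return; // identity-only change, skip the write

    const timer = setTimeout(() => {
      lastSentRef.current = serialized;
      updateDrawingContext(id, payload);
    }, 400); // debounce window (ms)

    return () => clearTimeout(timer); // reset the window while edits keep arriving
  }, [id, payload]);
}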

Comment on lines +32 to +37
uploadedGeoJson?: Array<{
id: string;
filename: string;
data: any; // FeatureCollection
visible: boolean;
}>;

uploadedGeoJson is typed as data: any in context and MapDataUpdaterProps, and geoJson?: any in tool output. Since this pipeline crosses persistence boundaries and executes viewport operations, this is a good place to tighten runtime validation and narrow types.

Even a lightweight validation (checking type and features array) would prevent downstream runtime errors and reduce the surface area for malicious/accidental payloads.

Suggestion

Replace any with Feature | FeatureCollection (from geojson) and add a small runtime validator at ingestion points (upload + tool) that rejects anything else.

Example types:

import type { Feature, FeatureCollection } from 'geojson'

data: FeatureCollection | Feature

Reply with "@CharlieHelps yes please" if you’d like me to add a commit that tightens types and adds a shared isGeoJsonFeatureOrCollection() helper.
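
A minimal sketch of that shared guard, under the assumption that a structural check on type/features/geometry is enough at ingestion time; stricter validation (e.g. coordinate checks) is out of scope here:

import type { Feature, FeatureCollection } from 'geojson';

export function isGeoJsonFeatureOrCollection(
  value: unknown
): value is Feature | FeatureCollection {
  if (typeof value !== 'object' || value === null) return false;
  const candidate = value as { type?: unknown; features?: unknown; geometry?: unknown };
  if (candidate.type === 'FeatureCollection') {
    return Array.isArray(candidate.features);
  }
  if (candidate.type === 'Feature') {
    // geometry may legitimately be null per the GeoJSON spec
    return typeof candidate.geometry === 'object';
  }
  return false;
}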

@charliecreates charliecreates bot removed the request for review from CharlieHelps February 1, 2026 17:02
- Extend MapData context to support uploaded GeoJSON layers
- Update ChatPanel to support GeoJSON file selection and add test IDs
- Implement MapDataUpdater component for automatic context sync and map framing
- Update Mapbox and Google Maps components to render uploaded GeoJSON
- Enable AI tools to ingest GeoJSON into the map pipeline via MapQueryHandler
- Ensure persistence of GeoJSON data across chat sessions via database sync
- Update AIMessage type to support 'geojson_upload'
- Fix ESLint warnings to ensure clean build performance

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
components/map/mapbox-map.tsx (1)

356-458: ⚠️ Potential issue | 🟠 Major

Map re-initialization on camera updates (Line 458).
Including mapData.cameraState and position in this effect’s deps will destroy and recreate the map on every camera update (since cleanup runs before re-run). This can cause flicker, lost state, and heavy performance regressions.

🛠️ Proposed fix: keep initialization effect stable
-  }, [setMap, setIsMapLoaded, captureMapCenter, handleUserInteraction, stopRotation, mapData.cameraState, position?.latitude, position?.longitude])
+  }, [setMap, setIsMapLoaded, captureMapCenter, handleUserInteraction, stopRotation])
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 1b56819 and 9f79929.

📒 Files selected for processing (3)
  • components/chat-panel.tsx
  • components/map/mapbox-map.tsx
  • lib/types/index.ts
🧰 Additional context used
🧬 Code graph analysis (1)
components/map/mapbox-map.tsx (1)
components/map/geojson-layer.tsx (4)
  • GeoJsonLayer (13-100)
  • map (24-80)
  • map (89-96)
  • GeoJsonLayerProps (8-11)
🔇 Additional comments (5)
components/chat-panel.tsx (2)

133-156: Good dependency fix for debounced suggestions.
Including setSuggestions avoids stale closure risk without changing behavior.


210-218: GeoJSON accept + test id looks good.
The accept filter now supports GeoJSON uploads and adds a stable selector for tests.

lib/types/index.ts (1)

65-77: Type union extension is consistent.
Adding 'geojson_upload' aligns with the new message flow.

components/map/mapbox-map.tsx (2)

13-16: No issues to flag here.


573-575: GeoJSON layer rendering is properly gated.
Rendering only visible layers keeps behavior predictable and aligns with the new upload pipeline.

✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.

@ngoiyaeric
Collaborator Author

@coderabbitai it's stuck refreshing and I need to be able to copy GeoJSON into the chat input

@coderabbitai
Contributor

coderabbitai bot commented Feb 1, 2026

Tip

For best results, initiate chat on the files or code changes.

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Read the chat.tsx file to examine the useEffect that triggers updateDrawingContext
cat components/chat.tsx

Length of output: 8151


🏁 Script executed:

#!/bin/bash
# Read the MapDataUpdater to understand how it updates context
cat components/map/map-data-updater.tsx

Length of output: 1690


🏁 Script executed:

#!/bin/bash
# Read actions.tsx to see how GeoJSON is currently handled
cat app/actions.tsx | head -200

Length of output: 6910


🏁 Script executed:

#!/bin/bash
# See the rest of actions.tsx to understand the complete flow
cat app/actions.tsx | tail -100

Length of output: 3123


🏁 Script executed:

#!/bin/bash
# Get the middle section of actions.tsx to see the file handling and where router.refresh might be triggered
cat app/actions.tsx | sed -n '200,400p'

Length of output: 5986


🏁 Script executed:

#!/bin/bash
# Check if there's a router.refresh call related to response messages
rg "router\.refresh" components/chat.tsx -A 2 -B 2

Length of output: 180


🏁 Script executed:

#!/bin/bash
# Look at the updateDrawingContext function to understand what it does
cat lib/actions/chat.ts | grep -A 30 "updateDrawingContext"

Length of output: 2966


@ngoiyaeric I've identified the root cause of the infinite refresh loop and can help with both issues:

🔴 Issue 1: Infinite Refresh Loop

The loop occurs because:

  1. When the page loads, getUIStateFromAIState renders MapDataUpdater components for saved 'data' messages
  2. MapDataUpdater updates the context (even though it has duplicate check)
  3. The chat.tsx useEffect detects the context change and calls updateDrawingContext
  4. updateDrawingContext creates a new 'data' message in the database every time
  5. This triggers re-rendering and the cycle repeats

Solution:

Fix 1: Add a ref to track if data is being restored in chat.tsx:

// Add near the top with other useState/useRef hooks
const isRestoringRef = useRef(false);

// Modify the useEffect that calls updateDrawingContext:
useEffect(() => {
  // Don't call updateDrawingContext during initial restoration
  if (isRestoringRef.current) {
    isRestoringRef.current = false;
    return;
  }
  
  if (id && (mapData.drawnFeatures || mapData.uploadedGeoJson) && mapData.cameraState) {
    console.log('Chat.tsx: map data changed, calling updateDrawingContext');
    updateDrawingContext(id, {
      drawnFeatures: mapData.drawnFeatures || [],
      cameraState: mapData.cameraState,
      uploadedGeoJson: mapData.uploadedGeoJson || []
    });
  }
}, [id, mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]);

Fix 2: Update MapDataUpdater to set the restoration flag:

export function MapDataUpdater({ id, data, filename }: MapDataUpdaterProps) {
  const { setMapData } = useMapData();
  const { map } = useMap();
  const hasUpdatedRef = useRef(false); // Add this

  useEffect(() => {
    if (!data || hasUpdatedRef.current) return; // Prevent re-runs
    hasUpdatedRef.current = true; // Mark as updated

    // Ensure it's a FeatureCollection for consistency
    const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
      ? data
      : { type: 'FeatureCollection', features: [data] };

    // Update MapData context
    setMapData(prev => {
      // Avoid duplicate entries
      const alreadyExists = prev.uploadedGeoJson?.some(item => item.id === id);
      if (alreadyExists) return prev;

      return {
        ...prev,
        uploadedGeoJson: [
          ...(prev.uploadedGeoJson || []),
          {
            id,
            filename,
            data: featureCollection,
            visible: true
          }
        ]
      };
    });

    // Fly to the extent of the GeoJSON (only if not already present)
    if (map && featureCollection.features.length > 0) {
      try {
        const bbox = turf.bbox(featureCollection);
        map.fitBounds(bbox as [number, number, number, number], {
          padding: 50,
          maxZoom: 15,
          duration: 2000
        });
      } catch (e) {
        console.error('Failed to fit bounds for GeoJSON:', e);
      }
    }
  }, [id, data, filename, setMapData, map]);

  return null;
}

🟢 Issue 2: Paste GeoJSON into Chat Input

Add GeoJSON detection from pasted text in actions.tsx. Insert this logic after checking for file uploads:

// After the file handling block (around line 330), add:

// Check if userInput contains GeoJSON
if (userInput && !file) {
  try {
    const trimmedInput = userInput.trim();
    // Try to parse as JSON
    if ((trimmedInput.startsWith('{') && trimmedInput.endsWith('}')) || 
        (trimmedInput.startsWith('[') && trimmedInput.endsWith(']'))) {
      const possibleGeoJson = JSON.parse(trimmedInput);
      
      // Check if it's valid GeoJSON
      if (possibleGeoJson.type === 'FeatureCollection' || possibleGeoJson.type === 'Feature') {
        const geoJsonId = nanoid();
        
        // Add a special message to track the GeoJSON upload
        aiState.update({
          ...aiState.get(),
          messages: [
            ...aiState.get().messages,
            {
              id: geoJsonId,
              role: 'assistant',
              content: JSON.stringify({ 
                data: possibleGeoJson, 
                filename: 'Pasted GeoJSON' 
              }),
              type: 'geojson_upload'
            }
          ]
        });

        // Immediately append the updater to the UI stream
        uiStream.append(
          <MapDataUpdater 
            id={geoJsonId} 
            data={possibleGeoJson} 
            filename="Pasted GeoJSON" 
          />
        );
        
        // Clear the input since we've handled it as GeoJSON
        // Don't add it to messages as regular text
        return {
          id: nanoid(),
          isGenerating: isGenerating.value,
          component: uiStream.value,
          isCollapsed: isCollapsed.value
        };
      }
    }
  } catch (e) {
    // Not valid JSON, continue with normal text handling
  }
}

Would you like me to create a plan to implement these fixes?


💡 Use @coderabbitai plan to create a plan for code changes

@google-labs-jules
Contributor

@coderabbitai it's stuck refreshing and I need to be able to copy GeoJSON into the chat input

I've implemented support for pasting GeoJSON directly into the chat input. I've also added a synchronization guard to prevent redundant calls to updateDrawingContext, which should resolve the refresh loop you were experiencing.

- Add support for pasting GeoJSON directly into the chat input
- Implement synchronization guard in Chat.tsx to prevent redundant state saves
- Ensure 'geojson_upload' message type is correctly handled in AI state
- Improve GeoJSON extraction from geospatial tool responses

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
app/actions.tsx (1)

348-386: ⚠️ Potential issue | 🟡 Minor

Uploaded GeoJSON file content is also sent as text to the AI model.

Similar to pasted GeoJSON, after successfully parsing a GeoJSON file (lines 352-375), the raw text content is still prepended to messageParts (lines 381-386). For large GeoJSON files, this could significantly increase token usage and processing time.

Consider conditionally skipping the text prepending when GeoJSON is successfully processed:

🛡️ Proposed fix
+      let geoJsonHandled = false
       if (isGeoJson) {
         try {
           const geoJson = JSON.parse(textContent)
           if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
             const geoJsonId = nanoid()
             // Add a special message to track the GeoJSON upload
             aiState.update({
               ...aiState.get(),
               messages: [
                 ...aiState.get().messages,
                 {
                   id: geoJsonId,
                   role: 'assistant',
                   content: JSON.stringify({ data: geoJson, filename: file.name }),
                   type: 'geojson_upload'
                 }
               ]
             })

             // Immediately append the updater to the UI stream
             uiStream.append(
               <MapDataUpdater id={geoJsonId} data={geoJson} filename={file.name} />
             )
+            geoJsonHandled = true
           }
         } catch (e) {
           console.error('Failed to parse GeoJSON:', e)
         }
       }

-      const existingTextPart = messageParts.find(p => p.type === 'text')
-      if (existingTextPart) {
-        existingTextPart.text = `${textContent}\n\n${existingTextPart.text}`
-      } else {
-        messageParts.push({ type: 'text', text: textContent })
+      // Only add raw text if GeoJSON wasn't successfully handled
+      if (!geoJsonHandled) {
+        const existingTextPart = messageParts.find(p => p.type === 'text')
+        if (existingTextPart) {
+          existingTextPart.text = `${textContent}\n\n${existingTextPart.text}`
+        } else {
+          messageParts.push({ type: 'text', text: textContent })
+        }
       }
🤖 Fix all issues with AI agents
In `@app/actions.tsx`:
- Around line 780-786: The JSON.parse in the 'geojson_upload' case can throw for
malformed content; wrap the parse of content in a try-catch (the case handling
that returns component: <MapDataUpdater ... />) similar to the other role
branches: on success return the existing object with id and component:
<MapDataUpdater id={id} data={data} filename={filename} />, on parse failure
catch the error, log or report it, and return a safe fallback entry (e.g., an
object with id and a lightweight error/display component or null component) so
UI state reconstruction doesn't crash; reference the 'geojson_upload' case,
JSON.parse(content as string), and MapDataUpdater when applying the fix.
- Around line 214-241: The pasted GeoJSON block currently handles the data
(using trimmedInput, JSON.parse, aiState.update and uiStream.append with
MapDataUpdater) but does not stop further processing, so the raw JSON still gets
added to messageParts; fix this by short-circuiting after successful GeoJSON
handling — either clear userInput or set a local flag (e.g. processedGeoJson)
and return/skip the subsequent messageParts construction/send logic so the raw
JSON is not appended or sent to the AI model.

In `@components/chat.tsx`:
- Around line 105-109: The current sync flow calls updateDrawingContext (in
components/chat.tsx) which—via lib/actions/chat.ts—creates a new message record
each time and embeds the full uploadedGeoJson array, causing duplicate large
payloads; change this to either (A) send only lightweight metadata (id,
filename, visible, maybe a geoJsonRef) from the UI and persist full GeoJSON in a
separate store/table with a stable key, or (B) modify
updateDrawingContext/lib/actions/chat.ts to detect existing drawing-context
messages (by drawing id or message type) and perform an update/patch of the
existing message instead of inserting a new one; update the callsites in
components/chat.tsx to pass the reduced metadata or reference key accordingly
and ensure read paths fetch full GeoJSON by reference when rendering.
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 9f79929 and bfe62f9.

📒 Files selected for processing (2)
  • app/actions.tsx
  • components/chat.tsx
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2026-01-13T13:26:30.086Z
Learnt from: ngoiyaeric
Repo: QueueLab/QCX PR: 0
File: :0-0
Timestamp: 2026-01-13T13:26:30.086Z
Learning: Workers that rely on ai.generateObject (geojsonParser, map-command-generator, feedback-analyzer) must not use the X.AI 'grok-4-fast-non-reasoning' model because X.AI’s chat/completions rejects JSON Schema structured-output arguments; prefer OpenAI gpt-4o for structured outputs.

Applied to files:

  • app/actions.tsx
🧬 Code graph analysis (2)
app/actions.tsx (1)
components/map/map-data-updater.tsx (1)
  • MapDataUpdater (15-63)
components/chat.tsx (6)
lib/actions/chat.ts (1)
  • updateDrawingContext (165-205)
components/map/mapbox-map.tsx (2)
  • prevData (166-166)
  • prevData (330-339)
components/map/map-data-context.tsx (3)
  • MapData (15-32)
  • mapData (41-49)
  • MapDataContextType (34-37)
components/map/map-3d.tsx (1)
  • prevData (35-45)
components/map/google-map.tsx (1)
  • features (38-52)
components/map/map-query-handler.tsx (2)
  • prevData (52-56)
  • prevData (39-48)
🔇 Additional comments (1)
app/actions.tsx (1)

846-865: LGTM!

The data role handling correctly parses context data with proper try-catch error handling and renders MapDataUpdater components for each uploaded GeoJSON item.

✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.

Comment on lines 214 to 241
if (userInput) {
try {
const trimmedInput = userInput.trim()
if ((trimmedInput.startsWith('{') && trimmedInput.endsWith('}')) || (trimmedInput.startsWith('[') && trimmedInput.endsWith(']'))) {
const geoJson = JSON.parse(trimmedInput)
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
const geoJsonId = nanoid()
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: 'Pasted GeoJSON' }),
type: 'geojson_upload'
}
]
})
uiStream.append(
<MapDataUpdater id={geoJsonId} data={geoJson} filename="Pasted GeoJSON" />
)
}
}
} catch (e) {
// Not a valid JSON, ignore
}
}

⚠️ Potential issue | 🟠 Major

Pasted GeoJSON is processed twice — once as upload and again as text input.

After successfully detecting and handling GeoJSON (lines 219-236), the code continues execution without returning or clearing userInput. The raw JSON text is still added to messageParts at line 334 and sent to the AI model, wasting tokens and potentially confusing the model with raw coordinate data.

Consider skipping the text message when GeoJSON is successfully parsed:

🐛 Proposed fix to prevent double processing
   if (userInput) {
     try {
       const trimmedInput = userInput.trim()
       if ((trimmedInput.startsWith('{') && trimmedInput.endsWith('}')) || (trimmedInput.startsWith('[') && trimmedInput.endsWith(']'))) {
         const geoJson = JSON.parse(trimmedInput)
         if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
           const geoJsonId = nanoid()
           aiState.update({
             ...aiState.get(),
             messages: [
               ...aiState.get().messages,
               {
                 id: geoJsonId,
                 role: 'assistant',
                 content: JSON.stringify({ data: geoJson, filename: 'Pasted GeoJSON' }),
                 type: 'geojson_upload'
               }
             ]
           })
           uiStream.append(
             <MapDataUpdater id={geoJsonId} data={geoJson} filename="Pasted GeoJSON" />
           )
+          // Skip normal text processing for GeoJSON input
+          isGenerating.done(false)
+          uiStream.done()
+          aiState.done(aiState.get())
+          return {
+            id: nanoid(),
+            isGenerating: isGenerating.value,
+            component: uiStream.value,
+            isCollapsed: isCollapsed.value
+          }
         }
       }
     } catch (e) {
       // Not a valid JSON, ignore
     }
   }
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 214 - 241, The pasted GeoJSON block currently
handles the data (using trimmedInput, JSON.parse, aiState.update and
uiStream.append with MapDataUpdater) but does not stop further processing, so
the raw JSON still gets added to messageParts; fix this by short-circuiting
after successful GeoJSON handling — either clear userInput or set a local flag
(e.g. processedGeoJson) and return/skip the subsequent messageParts
construction/send logic so the raw JSON is not appended or sent to the AI model.

Comment on lines +780 to +786
case 'geojson_upload': {
const { data, filename } = JSON.parse(content as string)
return {
id,
component: <MapDataUpdater id={id} data={data} filename={filename} />
}
}

⚠️ Potential issue | 🟡 Minor

Missing error handling for malformed geojson_upload content.

The JSON.parse call on line 781 will throw if content is malformed, potentially breaking the entire UI state reconstruction. Other similar cases in this function (e.g., tool role at line 790) wrap parsing in try-catch.

🛡️ Proposed fix
             case 'geojson_upload': {
+              try {
                 const { data, filename } = JSON.parse(content as string)
                 return {
                   id,
                   component: <MapDataUpdater id={id} data={data} filename={filename} />
                 }
+              } catch (e) {
+                console.error('Error parsing geojson_upload content:', e)
+                return { id, component: null }
+              }
             }
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 780 - 786, The JSON.parse in the
'geojson_upload' case can throw for malformed content; wrap the parse of content
in a try-catch (the case handling that returns component: <MapDataUpdater ...
/>) similar to the other role branches: on success return the existing object
with id and component: <MapDataUpdater id={id} data={data} filename={filename}
/>, on parse failure catch the error, log or report it, and return a safe
fallback entry (e.g., an object with id and a lightweight error/display
component or null component) so UI state reconstruction doesn't crash; reference
the 'geojson_upload' case, JSON.parse(content as string), and MapDataUpdater
when applying the fix.

@ngoiyaeric
Collaborator Author

@coderabbitai the application is refreshing and losing context

@coderabbitai
Contributor

coderabbitai bot commented Feb 1, 2026

Tip

For best results, initiate chat on the files or code changes.

🧩 Analysis chain

@ngoiyaeric Let me examine the current code to diagnose what's causing the refresh and context loss issues.


🏁 Script executed:

#!/bin/bash
# Read the current chat.tsx to see what synchronization guard was implemented
cat components/chat.tsx

Length of output: 8582


🏁 Script executed:

#!/bin/bash
# Read MapDataUpdater to check if it has proper guards
cat components/map/map-data-updater.tsx

Length of output: 1690


🏁 Script executed:

#!/bin/bash
# Check for any router.refresh calls that might be causing unwanted refreshes
rg "router\.refresh" --type=ts --type=tsx -C 3

Length of output: 27


🏁 Script executed:

#!/bin/bash
# Read actions.tsx to see the pasted GeoJSON implementation
cat app/actions.tsx | head -400

Length of output: 13211


🏁 Script executed:

#!/bin/bash
# Check updateDrawingContext to see if it's still creating new messages each time
cat lib/actions/chat.ts

Length of output: 9832


Let me search for all router.refresh calls with the correct command:


🏁 Script executed:

#!/bin/bash
# Search for router.refresh calls in TypeScript/TSX files
rg "router\.refresh" -C 3

Length of output: 1935


🏁 Script executed:

#!/bin/bash
# Look at the onGetUIStateFromAIState to see how MapDataUpdater is rendered
cat app/actions.tsx | sed -n '400,600p'

Length of output: 4533


🏁 Script executed:

#!/bin/bash
# Get the onGetUIStateFromAIState function to see how MapDataUpdater is rendered on page load
cat app/actions.tsx | tail -200

Length of output: 6299


@ngoiyaeric I found the root causes of both the refresh loop and context loss! There are three critical issues working together:

🔴 Critical Issues Found

1. router.refresh() causing context loss

In components/chat.tsx (lines 76-80), there's a router.refresh() that triggers on 'response' messages:

useEffect(() => {
  if (aiState.messages[aiState.messages.length - 1]?.type === 'response') {
    // Refresh the page to chat history updates
    router.refresh()  // ← This resets ALL React state!
  }
}, [aiState, router])

This completely refreshes the page and loses all React context state (MapData, etc.).

2. Infinite database message creation

updateDrawingContext in lib/actions/chat.ts creates a NEW database message every time it's called:

const savedMessage = await dbCreateMessage(messageToSave); // Creates new message each time!

3. MapDataUpdater missing guard

MapDataUpdater has no hasUpdatedRef to prevent re-running, so it updates context on every render.

🔄 The Infinite Loop

  1. Page loads → getUIStateFromAIState renders MapDataUpdater for each 'geojson_upload' and 'data' message
  2. MapDataUpdater updates MapData context
  3. chat.tsx useEffect calls updateDrawingContext
  4. updateDrawingContext creates a NEW 'data' message in the database
  5. AiState updates, possibly triggering router.refresh()
  6. Page reloads → now there's one MORE 'data' message to render
  7. Loop repeats, creating more messages each time

✅ Complete Fix

I need to make three changes. Would you like me to create a plan to implement these fixes?

Fix 1: Remove the problematic router.refresh()

// components/chat.tsx - REMOVE OR COMMENT OUT lines 76-80
// useEffect(() => {
//   if (aiState.messages[aiState.messages.length - 1]?.type === 'response') {
//     router.refresh()
//   }
// }, [aiState, router])

Fix 2: Add hasUpdatedRef to MapDataUpdater

// components/map/map-data-updater.tsx
const hasUpdatedRef = useRef(false);

useEffect(() => {
  if (!data || hasUpdatedRef.current) return;
  hasUpdatedRef.current = true;
  // ... rest of code
}, [id, data, filename, setMapData, map]);

Fix 3: Skip updateDrawingContext during restoration
Add a flag to detect when data is being restored from the database vs. newly drawn:

// components/chat.tsx
const isRestoringRef = useRef(true);

useEffect(() => {
  // Skip during initial restoration
  if (isRestoringRef.current) {
    isRestoringRef.current = false;
    return;
  }
  // ... rest of existing code
}, [id, mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]);

Would you like me to create a complete implementation plan for these fixes?


💡 Use @coderabbitai plan to create a plan for code changes

This commit implements a comprehensive pipeline for GeoJSON data overlays:
- Updated `ChatPanel` to accept `.geojson` files and handle pasted GeoJSON text.
- Added `MapDataUpdater` headless component to sync GeoJSON to map context and automatically fit viewport bounds using Turf.js.
- Fixed an infinite refresh loop in `Chat.tsx` by introducing `lastRefreshedMessageIdRef` to guard `router.refresh()`.
- Prevented redundant camera movement loops in `MapDataUpdater` using `hasZoomedRef`.
- Updated database schema and message mapping to include `type`, `attachments`, and tool metadata, ensuring map overlays persist across page refreshes.
- Implemented `onConflictDoUpdate` (upsert) for saving messages to prevent primary key collisions.
- Supported GeoJSON ingestion from both file uploads, manual text input, and agent tools.

Co-authored-by: ngoiyaeric <115367894+ngoiyaeric@users.noreply.github.com>
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 5

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
app/actions.tsx (1)

879-886: ⚠️ Potential issue | 🟡 Minor

Add explicit return after the outer switch to handle unmapped type combinations.

The case 'user' and case 'assistant' branches contain inner switches without default cases. When a message type doesn't match any inner switch case, execution falls through the outer switch's break statement and returns undefined instead of a value. The subsequent .filter(message => message !== null) will not remove these undefined entries.

🛠️ Proposed fix
     .map((message, index) => {
       const { role, content, id, type, name } = message
       ...
       switch (role) {
         ...
       }
+      return null
     })
-    .filter(message => message !== null) as UIState
+    .filter((message): message is UIState[number] => message != null)
🤖 Fix all issues with AI agents
In `@app/search/[id]/page.tsx`:
- Around line 51-55: The mapping of DB records to AIMessage in page.tsx
currently casts nullable columns directly (dbMsg.type, dbMsg.toolName) which can
leave nulls where AIMessage expects undefined; update the mapping in the
function that constructs the AIMessage object (the block assigning
role/content/createdAt/type/name) to convert nullable values to undefined (e.g.,
use null-coalescing like dbMsg.type ?? undefined and dbMsg.toolName ?? undefined
or conditional expressions) so type and name are undefined when the DB returns
null.

In `@drizzle/migrations/0001_aromatic_ultimatum.sql`:
- Around line 1-12: This migration uses gen_random_uuid() in the calendar_notes
table but doesn't enable the pgcrypto extension; add a CREATE EXTENSION IF NOT
EXISTS "pgcrypto"; statement at the top of the first migration (before the
CREATE TABLE "calendar_notes") so gen_random_uuid() is available when the
migration runs.

In `@lib/actions/chat-db.ts`:
- Around line 123-136: The upsert currently uses
tx.insert(messages).values(messagesToInsert).onConflictDoUpdate targeting
messages.id which can overwrite rows across chats/users; modify the
onConflictDoUpdate call to include a setWhere clause that restricts updates to
rows where messages.chat_id = excluded.chat_id (or messages.chatId =
excluded.chatId) and messages.user_id = excluded.user_id (or messages.userId =
excluded.userId) so updates only apply when the incoming row matches the same
chatId and userId as the existing row; keep the same set mapping (content, role,
type, tool_name, tool_call_id, attachments) but add the setWhere guard to the
onConflictDoUpdate invocation on tx.insert/messages/messagesToInsert.

In `@lib/actions/chat.ts`:
- Around line 188-205: The current update branch in save/update logic uses
dbGetMessagesByChatId and existingMessages.find(...) which returns the first
(oldest) 'data' message and can leave the UI showing the last 'data' entry
out-of-sync with getUIStateFromAIState; change the selection to target the most
recent 'data' message (e.g., pick the last matching element by timestamp or id)
or iterate and update all messages with role === 'data' so the DB update against
messages (used in the db.update(...) block) updates the latest/each 'data'
record rather than the first one found; update references:
dbGetMessagesByChatId, existingMessages, dataMessage, and the db.update(...)
call accordingly.

In `@lib/db/schema.ts`:
- Around line 33-36: The attachments column is declared with
jsonb('attachments') which yields an untyped any; define and use a concrete type
for the JSON shape (e.g., Attachment or Attachment[] depending on expected
payload) and pass it into jsonb to lock down typings (e.g.,
jsonb<Attachment[]>('attachments')), or if null is not meaningful set a typed
default value instead; update the schema declaration that references attachments
and any related uses of toolName/toolCallId/type to use the new Attachment type
for downstream safety and remove unnecessary null checks.
📜 Review details

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bfe62f9 and 9450b15.

⛔ Files ignored due to path filters (1)
  • bun.lock is excluded by !**/*.lock
📒 Files selected for processing (12)
  • app/actions.tsx
  • app/search/[id]/page.tsx
  • components/chat.tsx
  • components/map/map-data-updater.tsx
  • drizzle/migrations/0001_aromatic_ultimatum.sql
  • drizzle/migrations/meta/0000_snapshot.json
  • drizzle/migrations/meta/0001_snapshot.json
  • drizzle/migrations/meta/_journal.json
  • lib/actions/chat-db.ts
  • lib/actions/chat.ts
  • lib/db/schema.ts
  • package.json
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2026-01-13T13:26:30.086Z
Learnt from: ngoiyaeric
Repo: QueueLab/QCX PR: 0
File: :0-0
Timestamp: 2026-01-13T13:26:30.086Z
Learning: Workers that rely on ai.generateObject (geojsonParser, map-command-generator, feedback-analyzer) must not use the X.AI 'grok-4-fast-non-reasoning' model because X.AI’s chat/completions rejects JSON Schema structured-output arguments; prefer OpenAI gpt-4o for structured outputs.

Applied to files:

  • app/actions.tsx
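
For context, a minimal sketch of the kind of structured-output call this learning refers to, using the Vercel AI SDK's generateObject with an OpenAI model. The schema, prompt, and function name are placeholders, not the actual geojsonParser worker.

import { generateObject } from 'ai'
import { openai } from '@ai-sdk/openai'
import { z } from 'zod'

// Placeholder schema: the real workers define their own output shapes.
const schema = z.object({
  featureType: z.string(),
  coordinates: z.array(z.number())
})

export async function parseWithStructuredOutput(text: string) {
  // generateObject sends a JSON Schema derived from the zod schema,
  // which is why the chosen model must support structured outputs.
  const { object } = await generateObject({
    model: openai('gpt-4o'),
    schema,
    prompt: `Extract a single map feature from: ${text}`
  })
  return object
}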
🧬 Code graph analysis (5)
lib/actions/chat-db.ts (1)
lib/db/schema.ts (1)
  • messages (26-37)
lib/db/schema.ts (1)
lib/types/index.ts (1)
  • Chat (49-57)
app/search/[id]/page.tsx (1)
lib/types/index.ts (1)
  • AIMessage (59-78)
components/chat.tsx (3)
components/map/map-data-context.tsx (2)
  • MapData (15-32)
  • mapData (41-49)
lib/actions/chat.ts (1)
  • updateDrawingContext (165-222)
components/map/mapbox-map.tsx (3)
  • prevData (166-166)
  • prevData (330-339)
  • feature (78-164)
components/map/map-data-updater.tsx (2)
components/map/map-data-context.tsx (4)
  • useMapData (61-67)
  • mapData (41-49)
  • MapData (15-32)
  • MapDataContextType (34-37)
components/map/map-context.tsx (1)
  • useMap (24-30)
🪛 Biome (2.3.13)
app/actions.tsx

[error] 698-698: This callback passed to map() iterable method should always return a value.

Add missing return statements so that this callback returns a value on all execution paths.

(lint/suspicious/useIterableCallbackReturn)

🔇 Additional comments (10)
drizzle/migrations/meta/0000_snapshot.json (1)

1-178: Snapshot metadata update looks consistent.

drizzle/migrations/meta/0001_snapshot.json (1)

1-305: Snapshot looks consistent with the updated schema.

package.json (1)

65-65: No action required—drizzle-orm 0.45.1 has no breaking changes.

drizzle-orm 0.45.1 (released December 10) contains only a single bug fix for pg-native Pool detection and has no breaking changes to migrations or query APIs. The current drizzle-kit 0.31.1 is compatible with this version; ideally, both should be upgraded together, but this is an optional alignment—not a blocker.

drizzle/migrations/meta/_journal.json (1)

11-18: Journal entry addition looks consistent.

app/actions.tsx (3)

26-26: Import wiring is fine.


693-703: Filtering to the latest data message makes restoration deterministic.


859-878: Data-message GeoJSON reconstruction looks good.

components/chat.tsx (3)

75-88: Refresh guard looks solid.

The last-message ID guard should prevent repeated refresh cycles on the same response; a rough sketch of the pattern follows this list.


96-115: Snapshot-based map sync reduces redundant updates.


120-174: Layout-only wrapper changes.

Also applies to: 180-238
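
As promised above, a rough sketch of a last-message-ID refresh guard, assuming a Next.js App Router setup. The ref name follows the commit message; the hook name, ChatMessage shape, and surrounding wiring are hypothetical rather than the actual Chat.tsx code.

import { useEffect, useRef } from 'react'
import { useRouter } from 'next/navigation'

type ChatMessage = { id: string; role: 'user' | 'assistant' | 'tool' | 'data' }

// Hypothetical hook: refresh server data once per completed assistant message.
export function useRefreshOnNewResponse(messages: ChatMessage[]) {
  const router = useRouter()
  const lastRefreshedMessageIdRef = useRef<string | null>(null)

  useEffect(() => {
    const lastMessage = messages[messages.length - 1]
    if (!lastMessage || lastMessage.role !== 'assistant') return
    // Skip if we already refreshed for this response.
    if (lastRefreshedMessageIdRef.current === lastMessage.id) return

    lastRefreshedMessageIdRef.current = lastMessage.id
    router.refresh()
  }, [messages, router])
}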

✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.

Comment on lines +51 to +55
role: dbMsg.role as AIMessage['role'],
content: dbMsg.content,
createdAt: dbMsg.createdAt ? new Date(dbMsg.createdAt) : undefined,
// 'type' and 'name' are not in the basic Drizzle 'messages' schema.
// These would be undefined unless specific logic is added to derive them.
// For instance, if a message with role 'tool' should have a 'name',
// or if some messages have a specific 'type' based on content or other flags.
// This mapping assumes standard user/assistant messages primarily.
type: dbMsg.type as AIMessage['type'],
name: dbMsg.toolName as string,

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

cd app/search && find . -type f -name "page.tsx" | head -5

Repository: QueueLab/QCX

Length of output: 71


🏁 Script executed:

cd app && find . -type f -name "*.tsx" -o -name "*.ts" | grep -E "(page|types|message)" | head -20

Repository: QueueLab/QCX

Length of output: 89


🏁 Script executed:

cat -n app/search/\[id\]/page.tsx | sed -n '40,65p'

Repository: QueueLab/QCX

Length of output: 1170


🏁 Script executed:

# Find AIMessage type definition
rg -n "type AIMessage|interface AIMessage" --type=ts --type=tsx

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Check dbMsg source and context
rg -n "dbMsg" app/search/\[id\]/page.tsx -B 5 -A 2

Repository: QueueLab/QCX

Length of output: 607


🏁 Script executed:

rg -n "type AIMessage|interface AIMessage" -A 10

Repository: QueueLab/QCX

Length of output: 1274


🏁 Script executed:

rg -n "type DrizzleMessage|interface DrizzleMessage" -A 10

Repository: QueueLab/QCX

Length of output: 38


🏁 Script executed:

# Search for where these types are defined
fd -e ts -e tsx | xargs rg "export (type|interface) AIMessage"

Repository: QueueLab/QCX

Length of output: 100


🏁 Script executed:

fd -e ts -e tsx | xargs rg "export (type|interface) DrizzleMessage"

Repository: QueueLab/QCX

Length of output: 40


🏁 Script executed:

# Check for downstream usages of message.name and message.type
rg "\.name\s*\?" -t=ts -t=tsx | head -20

Repository: QueueLab/QCX

Length of output: 83


🏁 Script executed:

# Look for optional chaining or null checks on message fields
rg "message\.(name|type|toolName)" | head -30

Repository: QueueLab/QCX

Length of output: 501


🏁 Script executed:

# Find DrizzleMessage or Message type from Drizzle schema
rg -n "export type Message|export type DrizzleMessage" -A 15

Repository: QueueLab/QCX

Length of output: 1189


🏁 Script executed:

# Check the database schema or Drizzle definitions
fd -e ts -e tsx | xargs rg "messages.*table\|export const messages"

Repository: QueueLab/QCX

Length of output: 40


🏁 Script executed:

# Look at the Drizzle schema file
find . -name "*schema*" -o -name "*drizzle*" | head -10

Repository: QueueLab/QCX

Length of output: 549


🏁 Script executed:

# Check db directory structure
ls -la db/ 2>/dev/null || ls -la lib/db/ 2>/dev/null || find . -type d -name "db" | head -3

Repository: QueueLab/QCX

Length of output: 349


🏁 Script executed:

cat -n lib/db/schema.ts

Repository: QueueLab/QCX

Length of output: 4470


🏁 Script executed:

# Check the actual schema definition for the messages table
cat -n lib/schema/messages.ts 2>/dev/null || cat -n lib/db/schema.ts | grep -A 30 "messages"

Repository: QueueLab/QCX

Length of output: 3827


🏁 Script executed:

# Check the AIMessage type definition more completely
cat -n lib/types/index.ts | sed -n '59,75p'

Repository: QueueLab/QCX

Length of output: 614


🏁 Script executed:

# Check if there are any null checks on message.type or message.name in the codebase
rg "message\.(type|name)\s*\?" -A 2 -B 2

Repository: QueueLab/QCX

Length of output: 38


🏁 Script executed:

# Check how the initialMessages are used downstream
rg "initialMessages" app/search/\[id\]/page.tsx -A 5 -B 5

Repository: QueueLab/QCX

Length of output: 836


🏁 Script executed:

# Check how the type and name fields are accessed in downstream code
rg "\.type\s*==|\.type\s*!=|\.type\s*\?|\.name\s*==|\.name\s*!=" app/ -A 2 -B 2

Repository: QueueLab/QCX

Length of output: 3401


🏁 Script executed:

# Look at the AI component to see how it uses these fields
rg -n "initialAIState" app/ -A 10 | head -50

Repository: QueueLab/QCX

Length of output: 1910


🏁 Script executed:

# Check if there are assertions or strict equality checks on type/name
rg "(type|name)\s*!==" app/ -B 2 -A 2

Repository: QueueLab/QCX

Length of output: 981


Convert nullable database fields to undefined when mapping to AIMessage.

Lines 54–55 cast dbMsg.type and dbMsg.toolName without converting null values to undefined. Since the database schema defines these columns as nullable (no .notNull() constraint) and the AIMessage type expects undefined for optional fields, assign null-coalesced values to maintain type consistency.

Recommended fix
-      type: dbMsg.type as AIMessage['type'],
-      name: dbMsg.toolName as string,
+      type: dbMsg.type ?? undefined,
+      name: dbMsg.toolName ?? undefined,
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
role: dbMsg.role as AIMessage['role'],
content: dbMsg.content,
createdAt: dbMsg.createdAt ? new Date(dbMsg.createdAt) : undefined,
// 'type' and 'name' are not in the basic Drizzle 'messages' schema.
// These would be undefined unless specific logic is added to derive them.
// For instance, if a message with role 'tool' should have a 'name',
// or if some messages have a specific 'type' based on content or other flags.
// This mapping assumes standard user/assistant messages primarily.
type: dbMsg.type as AIMessage['type'],
name: dbMsg.toolName as string,
role: dbMsg.role as AIMessage['role'],
content: dbMsg.content,
createdAt: dbMsg.createdAt ? new Date(dbMsg.createdAt) : undefined,
type: dbMsg.type ?? undefined,
name: dbMsg.toolName ?? undefined,
🤖 Prompt for AI Agents
In `@app/search/`[id]/page.tsx around lines 51 - 55, The mapping of DB records to
AIMessage in page.tsx currently casts nullable columns directly (dbMsg.type,
dbMsg.toolName) which can leave nulls where AIMessage expects undefined; update
the mapping in the function that constructs the AIMessage object (the block
assigning role/content/createdAt/type/name) to convert nullable values to
undefined (e.g., use null-coalescing like dbMsg.type ?? undefined and
dbMsg.toolName ?? undefined or conditional expressions) so type and name are
undefined when the DB returns null.

Comment on lines +1 to +12
CREATE TABLE "calendar_notes" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"user_id" uuid NOT NULL,
"chat_id" uuid,
"date" timestamp with time zone NOT NULL,
"content" text NOT NULL,
"location_tags" jsonb,
"user_tags" text[],
"map_feature_id" text,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

rg -n "CREATE EXTENSION.*pgcrypto|gen_random_uuid" drizzle/migrations

Repository: QueueLab/QCX

Length of output: 1209


🏁 Script executed:

rg -n "CREATE EXTENSION" drizzle/

Repository: QueueLab/QCX

Length of output: 38


🏁 Script executed:

head -20 drizzle/migrations/0000_sweet_metal_master.sql

Repository: QueueLab/QCX

Length of output: 763


🏁 Script executed:

find . -maxdepth 2 -type f \( -name "README*" -o -name "*.md" -o -name "init*" -o -name "setup*" \) | head -20

Repository: QueueLab/QCX

Length of output: 356


🏁 Script executed:

rg -i "pgcrypto|database.*init|extension" README.md ./docs/ ./CLOUD_BUILD_SETUP.md 2>/dev/null | head -20

Repository: QueueLab/QCX

Length of output: 38


Ensure pgcrypto is enabled before using gen_random_uuid().

gen_random_uuid() is a core function on PostgreSQL 13+, but on older versions it requires the pgcrypto extension, and no migration file creates it. On such databases this migration will fail unless pgcrypto is already enabled. Add CREATE EXTENSION IF NOT EXISTS "pgcrypto"; at the top of the first migration, or confirm that your database setup provides it.

🛠️ Proposed fix
+CREATE EXTENSION IF NOT EXISTS "pgcrypto";
+
 CREATE TABLE "calendar_notes" (
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
CREATE TABLE "calendar_notes" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"user_id" uuid NOT NULL,
"chat_id" uuid,
"date" timestamp with time zone NOT NULL,
"content" text NOT NULL,
"location_tags" jsonb,
"user_tags" text[],
"map_feature_id" text,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
CREATE EXTENSION IF NOT EXISTS "pgcrypto";
CREATE TABLE "calendar_notes" (
"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
"user_id" uuid NOT NULL,
"chat_id" uuid,
"date" timestamp with time zone NOT NULL,
"content" text NOT NULL,
"location_tags" jsonb,
"user_tags" text[],
"map_feature_id" text,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
🤖 Prompt for AI Agents
In `@drizzle/migrations/0001_aromatic_ultimatum.sql` around lines 1 - 12, This
migration uses gen_random_uuid() in the calendar_notes table but doesn't enable
the pgcrypto extension; add a CREATE EXTENSION IF NOT EXISTS "pgcrypto";
statement at the top of the first migration (before the CREATE TABLE
"calendar_notes") so gen_random_uuid() is available when the migration runs.

Comment on lines +123 to +136
// Use upsert to avoid duplicate primary keys and prevent redundant appends
await tx.insert(messages)
.values(messagesToInsert)
.onConflictDoUpdate({
target: messages.id,
set: {
content: sql`excluded.content`,
role: sql`excluded.role`,
type: sql`excluded.type`,
toolName: sql`excluded.tool_name`,
toolCallId: sql`excluded.tool_call_id`,
attachments: sql`excluded.attachments`
}
});

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# Find and examine the chat-db.ts file
find . -name "chat-db.ts" -type f

Repository: QueueLab/QCX

Length of output: 80


🏁 Script executed:

# Read the file to see the full context around lines 123-136
cat -n lib/actions/chat-db.ts | sed -n '100,150p'

Repository: QueueLab/QCX

Length of output: 2263


🏁 Script executed:

# Check the messages table schema definition
fd -e ts -e js | xargs grep -l "messages.*Table\|export.*messages" | head -5

Repository: QueueLab/QCX

Length of output: 95


🏁 Script executed:

# Examine the messages table schema
cat -n lib/db/schema.ts | head -150

Repository: QueueLab/QCX

Length of output: 4470


🏁 Script executed:

# Search specifically for the messages table definition
grep -n "messages.*=" lib/db/schema.ts | head -10

Repository: QueueLab/QCX

Length of output: 176


🏁 Script executed:

# Check if onConflictDoUpdate with where clause is used elsewhere in the codebase
rg "onConflictDoUpdate" -A 5 lib/

Repository: QueueLab/QCX

Length of output: 391


🏁 Script executed:

# Check Drizzle ORM version in package.json to understand API capabilities
cat package.json | grep -A 2 -B 2 "drizzle"

Repository: QueueLab/QCX

Length of output: 230


🌐 Web query:

Drizzle ORM onConflictDoUpdate where clause support version 0.45

💡 Result:

In Drizzle ORM v0.45.x, .onConflictDoUpdate() does support WHERE clauses, but not via a single where field. Instead you use:

  • targetWhere: adds WHERE ... to the conflict target (for partial indexes / ON CONFLICT (...) WHERE ...)
  • setWhere: adds WHERE ... to the DO UPDATE part (... DO UPDATE SET ... WHERE ...)

Example:

await db.insert(employees).values({...}).onConflictDoUpdate({
  target: employees.employeeId,
  targetWhere: sql`name <> 'John Doe'`,
  set: { name: sql`excluded.name` },
});

await db.insert(employees).values({...}).onConflictDoUpdate({
  target: employees.employeeId,
  set: { name: 'John Doe' },
  setWhere: sql`name <> 'John Doe'`,
});

This split (where → targetWhere + setWhere) was introduced earlier (v0.30.8 for Postgres; v0.30.9 for SQLite), so it’s available in v0.45.0 as well. [1] [2]

Notes:

  • This applies to PostgreSQL and SQLite upserts (ON CONFLICT ...). [1]
  • MySQL uses .onDuplicateKeyUpdate(...) (and doesn’t have the same ON CONFLICT ... WHERE ... structure). [1]

Sources:
[1] Drizzle ORM Insert docs (where clauses, targetWhere/setWhere)
[2] Drizzle ORM v0.30.8 release note (split into setWhere and targetWhere)


🏁 Script executed:

# Check if there are any imports of `and`, `eq` functions from drizzle-orm in the file
grep -n "import.*from.*drizzle" lib/actions/chat-db.ts

Repository: QueueLab/QCX

Length of output: 172


🌐 Web query:

Drizzle ORM setWhere syntax and condition with eq and chatId userId

💡 Result:

In Drizzle ORM:

1) Normal UPDATE / SELECT / DELETE: use .where(...) (not setWhere)

import { and, eq } from "drizzle-orm";

await db
  .update(chatMembers)
  .set({ lastReadAt: new Date() })
  .where(and(
    eq(chatMembers.chatId, chatId),
    eq(chatMembers.userId, userId),
  ));

(.where(...) takes a single condition; combine multiple with and(...).) (orm.drizzle.team)

2) INSERT ... onConflictDoUpdate(...): setWhere is for the DO UPDATE’s WHERE

import { and, eq } from "drizzle-orm";

await db.insert(chatMembers).values({ chatId, userId, ... })
  .onConflictDoUpdate({
    target: [chatMembers.chatId, chatMembers.userId],
    set: { lastReadAt: new Date() },
    setWhere: and(
      eq(chatMembers.chatId, chatId),
      eq(chatMembers.userId, userId),
    ),
  });

setWhere / targetWhere are specific to onConflictDoUpdate. (orm.drizzle.team)

Citations:


Add setWhere guard to prevent cross-chat/user overwrites in upsert.

The upsert targets messages.id alone without guarding against conflicts. If an id collision occurs (or a malicious id is supplied), this can overwrite a message belonging to another chat or user. Add a setWhere clause that restricts updates to rows matching the same chatId and userId being inserted.

🔒 Guarded onConflict update
       await tx.insert(messages)
         .values(messagesToInsert)
         .onConflictDoUpdate({
           target: messages.id,
           set: {
             content: sql`excluded.content`,
             role: sql`excluded.role`,
             type: sql`excluded.type`,
             toolName: sql`excluded.tool_name`,
             toolCallId: sql`excluded.tool_call_id`,
             attachments: sql`excluded.attachments`
-          }
+          },
+          setWhere: and(
+            eq(messages.chatId, sql`excluded.chat_id`),
+            eq(messages.userId, sql`excluded.user_id`)
+          )
         });
🤖 Prompt for AI Agents
In `@lib/actions/chat-db.ts` around lines 123 - 136, The upsert currently uses
tx.insert(messages).values(messagesToInsert).onConflictDoUpdate targeting
messages.id which can overwrite rows across chats/users; modify the
onConflictDoUpdate call to include a setWhere clause that restricts updates to
rows where messages.chat_id = excluded.chat_id (or messages.chatId =
excluded.chatId) and messages.user_id = excluded.user_id (or messages.userId =
excluded.userId) so updates only apply when the incoming row matches the same
chatId and userId as the existing row; keep the same set mapping (content, role,
type, tool_name, tool_call_id, attachments) but add the setWhere guard to the
onConflictDoUpdate invocation on tx.insert/messages/messagesToInsert.

Comment on lines 188 to +205
try {
// We need to ensure the message is associated with the chat.
// dbCreateMessage requires chatId.
// Check if a 'data' message already exists to avoid duplication
const existingMessages = await dbGetMessagesByChatId(chatId);
const dataMessage = existingMessages.find(m => m.role === 'data');

if (dataMessage) {
// Update existing message content using direct db call since chat-db.ts doesn't have an updateMessage
const { db } = await import('@/lib/db');
const { messages } = await import('@/lib/db/schema');
const { eq } = await import('drizzle-orm');

await db.update(messages)
.set({ content: JSON.stringify(contextData) })
.where(eq(messages.id, dataMessage.id));

console.log('Drawing context updated for chat:', chatId, 'messageId:', dataMessage.id);
return { success: true, messageId: dataMessage.id };
}

⚠️ Potential issue | 🟠 Major

Update the most recent data message to stay in sync with UI restoration.

getUIStateFromAIState renders the last data message, but this update path uses find() which may pick the oldest entry if duplicates exist (legacy data). That can leave the rendered context stale after reloads. Prefer updating the latest data message (or updating all data messages for the chat) to match restoration behavior.

🐛 Proposed fix (pick latest data message)
-    const dataMessage = existingMessages.find(m => m.role === 'data');
+    const dataMessage = [...existingMessages]
+      .filter(m => m.role === 'data')
+      .sort(
+        (a, b) =>
+          new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime()
+      )[0];
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
try {
// We need to ensure the message is associated with the chat.
// dbCreateMessage requires chatId.
// Check if a 'data' message already exists to avoid duplication
const existingMessages = await dbGetMessagesByChatId(chatId);
const dataMessage = existingMessages.find(m => m.role === 'data');
if (dataMessage) {
// Update existing message content using direct db call since chat-db.ts doesn't have an updateMessage
const { db } = await import('@/lib/db');
const { messages } = await import('@/lib/db/schema');
const { eq } = await import('drizzle-orm');
await db.update(messages)
.set({ content: JSON.stringify(contextData) })
.where(eq(messages.id, dataMessage.id));
console.log('Drawing context updated for chat:', chatId, 'messageId:', dataMessage.id);
return { success: true, messageId: dataMessage.id };
}
try {
// Check if a 'data' message already exists to avoid duplication
const existingMessages = await dbGetMessagesByChatId(chatId);
const dataMessage = [...existingMessages]
.filter(m => m.role === 'data')
.sort(
(a, b) =>
new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime()
)[0];
if (dataMessage) {
// Update existing message content using direct db call since chat-db.ts doesn't have an updateMessage
const { db } = await import('@/lib/db');
const { messages } = await import('@/lib/db/schema');
const { eq } = await import('drizzle-orm');
await db.update(messages)
.set({ content: JSON.stringify(contextData) })
.where(eq(messages.id, dataMessage.id));
console.log('Drawing context updated for chat:', chatId, 'messageId:', dataMessage.id);
return { success: true, messageId: dataMessage.id };
}
🤖 Prompt for AI Agents
In `@lib/actions/chat.ts` around lines 188 - 205, The current update branch in
save/update logic uses dbGetMessagesByChatId and existingMessages.find(...)
which returns the first (oldest) 'data' message and can leave the UI showing the
last 'data' entry out-of-sync with getUIStateFromAIState; change the selection
to target the most recent 'data' message (e.g., pick the last matching element
by timestamp or id) or iterate and update all messages with role === 'data' so
the DB update against messages (used in the db.update(...) block) updates the
latest/each 'data' record rather than the first one found; update references:
dbGetMessagesByChatId, existingMessages, dataMessage, and the db.update(...)
call accordingly.

Comment on lines +33 to +36
attachments: jsonb('attachments'),
toolName: varchar('tool_name', { length: 100 }),
toolCallId: varchar('tool_call_id', {length: 100}),
type: varchar('type', { length: 50 })

🧹 Nitpick | 🔵 Trivial

Consider typing attachments to avoid any propagation.

Line 33 uses jsonb('attachments') without a type, which makes downstream typings less safe. If null isn’t meaningful, a typed default also removes null checks.

♻️ Suggested typing (adjust shape as needed)
-  attachments: jsonb('attachments'),
+  attachments: jsonb('attachments').$type<unknown[]>().default([]),
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
attachments: jsonb('attachments'),
toolName: varchar('tool_name', { length: 100 }),
toolCallId: varchar('tool_call_id', {length: 100}),
type: varchar('type', { length: 50 })
attachments: jsonb('attachments').$type<unknown[]>().default([]),
toolName: varchar('tool_name', { length: 100 }),
toolCallId: varchar('tool_call_id', {length: 100}),
type: varchar('type', { length: 50 })
🤖 Prompt for AI Agents
In `@lib/db/schema.ts` around lines 33 - 36, The attachments column is declared
with jsonb('attachments') which yields an untyped any; define and use a concrete
type for the JSON shape (e.g., Attachment or Attachment[] depending on expected
payload) and pass it into jsonb to lock down typings (e.g.,
jsonb<Attachment[]>('attachments')), or if null is not meaningful set a typed
default value instead; update the schema declaration that references attachments
and any related uses of toolName/toolCallId/type to use the new Attachment type
for downstream safety and remove unnecessary null checks.

@ngoiyaeric ngoiyaeric added help wanted Extra attention is needed Bug fix 1.0.0 Standard release labels Feb 1, 2026
