103 changes: 102 additions & 1 deletion app/actions.tsx
@@ -23,6 +23,7 @@ import { BotMessage } from '@/components/message'
import { SearchSection } from '@/components/search-section'
import SearchRelated from '@/components/search-related'
import { GeoJsonLayer } from '@/components/map/geojson-layer'
import { MapDataUpdater } from '@/components/map/map-data-updater'
import { CopilotDisplay } from '@/components/copilot-display'
import RetrieveSection from '@/components/retrieve-section'
import { VideoSearchSection } from '@/components/video-search-section'
@@ -210,6 +211,37 @@ async function submit(formData?: FormData, skip?: boolean)
: ((formData?.get('related_query') as string) ||
(formData?.get('input') as string))

let isGeoJsonInput = false
if (userInput) {
try {
const trimmedInput = userInput.trim()
if ((trimmedInput.startsWith('{') && trimmedInput.endsWith('}')) || (trimmedInput.startsWith('[') && trimmedInput.endsWith(']'))) {
const geoJson = JSON.parse(trimmedInput)
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
isGeoJsonInput = true
const geoJsonId = nanoid()
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: 'Pasted GeoJSON' }),
type: 'geojson_upload'
}
]
})
uiStream.append(
<MapDataUpdater id={geoJsonId} data={geoJson} filename="Pasted GeoJSON" />
)
}
}
} catch (e) {
// Not a valid JSON, ignore
}
}

if (userInput.toLowerCase().trim() === 'what is a planet computer?' || userInput.toLowerCase().trim() === 'what is qcx-terra?') {
const definition = userInput.toLowerCase().trim() === 'what is a planet computer?'
? `A planet computer is a proprietary environment aware system that interoperates weather forecasting, mapping and scheduling using cutting edge multi-agents to streamline automation and exploration on a planet. Available for our Pro and Enterprise customers. [QCX Pricing](https://www.queue.cx/#pricing)`
@@ -301,6 +333,8 @@ async function submit(formData?: FormData, skip?: boolean)
}[] = []

if (userInput) {
// If it's a GeoJSON input, we still want to keep it in the message history for the AI to see,
// but we might want to truncate it if it's huge. For now, just pass it.
messageParts.push({ type: 'text', text: userInput })
}

@@ -315,8 +349,39 @@
image: dataUrl,
mimeType: file.type
})
} else if (file.type === 'text/plain') {
} else if (file.type === 'text/plain' || file.name.endsWith('.geojson') || file.type === 'application/geo+json') {
const textContent = Buffer.from(buffer).toString('utf-8')
const isGeoJson = file.name.endsWith('.geojson') || file.type === 'application/geo+json'

if (isGeoJson) {
try {
const geoJson = JSON.parse(textContent)
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
const geoJsonId = nanoid()
// Add a special message to track the GeoJSON upload
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: file.name }),
type: 'geojson_upload'
}
]
})
Comment on lines +359 to +373 (Contributor)

🧹 Nitpick | 🔵 Trivial

Consider validating GeoJSON structure more thoroughly.

The current validation only checks geoJson.type. Invalid GeoJSON (e.g., a FeatureCollection with malformed features) could still pass through and cause rendering issues downstream. Consider using a lightweight validation:

🛡️ Optional: Add feature validation
           if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
+            // Basic validation for FeatureCollection
+            if (geoJson.type === 'FeatureCollection' && !Array.isArray(geoJson.features)) {
+              console.warn('Invalid FeatureCollection: missing features array')
+              return
+            }
             const geoJsonId = nanoid()
📝 Committable suggestion


Suggested change
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
const geoJsonId = nanoid()
// Add a special message to track the GeoJSON upload
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: file.name }),
type: 'geojson_upload'
}
]
})
if (geoJson.type === 'FeatureCollection' || geoJson.type === 'Feature') {
// Basic validation for FeatureCollection
if (geoJson.type === 'FeatureCollection' && !Array.isArray(geoJson.features)) {
console.warn('Invalid FeatureCollection: missing features array')
return
}
const geoJsonId = nanoid()
// Add a special message to track the GeoJSON upload
aiState.update({
...aiState.get(),
messages: [
...aiState.get().messages,
{
id: geoJsonId,
role: 'assistant',
content: JSON.stringify({ data: geoJson, filename: file.name }),
type: 'geojson_upload'
}
]
})
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 326 - 340, The current branch that creates a
geojson upload message (around geoJson/type check, nanoid generation and
aiState.update that pushes a message with type 'geojson_upload' and content from
file.name) should validate the GeoJSON more thoroughly before updating aiState:
implement or call a lightweight validator (e.g., a new validateGeoJSON function
or an is-geojson/geojson-validation library) to verify
FeatureCollection.features is an array and each Feature has valid properties
(type==='Feature', geometry with type and coordinates) and that
FeatureCollection has valid bbox/crs if present; if validation fails, avoid
pushing the message, log the error (or surface a user-facing error) and return
early. Ensure you reference/replace the simple geoJson.type check with this
validator in the block that generates geoJsonId and calls aiState.update so only
validated GeoJSON gets stored as type 'geojson_upload'.
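For reference, a minimal sketch of the kind of lightweight validator described above (the isValidGeoJson name and the exact checks are illustrative, not part of this PR):

import type { Feature, FeatureCollection } from 'geojson'

// Accept only a Feature, or a FeatureCollection whose features each carry a
// geometry object with a type and coordinates. GeometryCollection members are
// not handled here; extend the guard if they need to be supported.
function isValidGeoJson(value: unknown): value is Feature | FeatureCollection {
  if (!value || typeof value !== 'object') return false
  const candidate = value as { type?: unknown; features?: unknown }

  const isFeature = (f: any): boolean =>
    f != null &&
    f.type === 'Feature' &&
    f.geometry != null &&
    typeof f.geometry.type === 'string' &&
    'coordinates' in f.geometry

  if (candidate.type === 'Feature') return isFeature(candidate)
  if (candidate.type === 'FeatureCollection') {
    return Array.isArray(candidate.features) && candidate.features.every(isFeature)
  }
  return false
}

With a guard like this at both ingestion points (pasted input and file upload), the geojson_upload message would only be stored for data that passes validation.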


// Immediately append the updater to the UI stream
uiStream.append(
<MapDataUpdater id={geoJsonId} data={geoJson} filename={file.name} />
)
}
Comment on lines +360 to +379

Server action appends a React client component into the UI stream

submit (a server action) is directly appending <MapDataUpdater /> into uiStream. That means a server-side path is deciding to render a client component as a side effect of file parsing.

This is fragile for a few reasons:

  • It tightly couples server parsing with a specific client synchronization mechanism.
  • It risks odd replay behavior (e.g., streaming replays, retries) causing multiple MapDataUpdater mounts.
  • It makes the UI update dependent on server streaming rather than the persisted AI state reconstruction (which you already added via geojson_upload in getUIStateFromAIState).

Given you already insert a geojson_upload message into aiState, UI reconstruction can naturally render the updater from state without also pushing it immediately.

Suggestion

Remove the immediate uiStream.append(<MapDataUpdater ... />) and rely on the geojson_upload message being rendered via getUIStateFromAIState.

If you need instant feedback before the assistant message is produced, consider appending a lightweight server-renderable placeholder (e.g., “Uploaded X”) and let the client handle map updates based on AI state.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit making this change.

} catch (e) {
console.error('Failed to parse GeoJSON:', e)
}
}
Comment on lines +352 to +383 (Contributor)

⚠️ Potential issue | 🟡 Minor

Silent error handling may confuse users when GeoJSON parsing fails.

When JSON.parse(textContent) fails on Line 325, the error is only logged to the console. The file content still gets prepended to the message (lines 352-357), which may lead to confusing AI responses. Consider providing user feedback:

🛡️ Proposed improvement
         } catch (e) {
           console.error('Failed to parse GeoJSON:', e)
+          // Optionally notify the user
+          uiStream.append(
+            <BotMessage content={createStreamableValue(`⚠️ Failed to parse ${file.name} as valid GeoJSON. The file will be processed as plain text.`).value} />
+          )
         }
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 319 - 350, The GeoJSON parse failure is only
console.logged and the raw file content may still be prepended to messages;
modify the catch block around JSON.parse(textContent) to (1) prevent adding the
MapDataUpdater/uiStream entry or the geojson_upload assistant message when
parsing fails and (2) surface a user-facing error via aiState.update (add an
assistant or system message with id like geoJsonId or similar explaining the
file failed to parse) and optionally append a UI error component instead of
MapDataUpdater; reference the existing symbols JSON.parse, geoJsonId,
aiState.update, uiStream.append, and MapDataUpdater to locate and change
behavior.


const existingTextPart = messageParts.find(p => p.type === 'text')
if (existingTextPart) {
existingTextPart.text = `${textContent}\n\n${existingTextPart.text}`
Expand Down Expand Up @@ -624,10 +689,19 @@ export const AI = createAI<AIState, UIState>({
export const getUIStateFromAIState = (aiState: AIState): UIState => {
const chatId = aiState.chatId
const isSharePage = aiState.isSharePage

// Filter messages to only include the last 'data' message if multiple exist
const lastDataMessageIndex = [...aiState.messages].reverse().findIndex(m => m.role === 'data')
const actualLastDataIndex = lastDataMessageIndex === -1 ? -1 : aiState.messages.length - 1 - lastDataMessageIndex

return aiState.messages
.map((message, index) => {
const { role, content, id, type, name } = message

if (role === 'data' && index !== actualLastDataIndex) {
return null
}

if (
!type ||
type === 'end' ||
Expand Down Expand Up @@ -716,6 +790,13 @@ export const getUIStateFromAIState = (aiState: AIState): UIState => {
)
}
}
case 'geojson_upload': {
const { data, filename } = JSON.parse(content as string)
return {
id,
component: <MapDataUpdater id={id} data={data} filename={filename} />
}
}
Comment on lines +793 to +799

Unvalidated JSON.parse for geojson_upload can crash UI rendering

getUIStateFromAIState parses content for geojson_upload without a try/catch. Any malformed content (db corruption, older message format, partial writes) will throw and can break UI state mapping.

You already wrap parsing for other message kinds; this one should be consistent.

Suggestion

Wrap the JSON.parse in a try/catch (similar to the tool and data cases) and return { id, component: null } on failure.

Reply with "@CharlieHelps yes please" if you'd like me to add a commit with this defensive parsing.

Comment on lines +793 to +799 (Contributor)

⚠️ Potential issue | 🟡 Minor

Missing error handling for malformed geojson_upload content.

The JSON.parse call on line 781 will throw if content is malformed, potentially breaking the entire UI state reconstruction. Other similar cases in this function (e.g., tool role at line 790) wrap parsing in try-catch.

🛡️ Proposed fix
             case 'geojson_upload': {
+              try {
                 const { data, filename } = JSON.parse(content as string)
                 return {
                   id,
                   component: <MapDataUpdater id={id} data={data} filename={filename} />
                 }
+              } catch (e) {
+                console.error('Error parsing geojson_upload content:', e)
+                return { id, component: null }
+              }
             }
📝 Committable suggestion


Suggested change
case 'geojson_upload': {
const { data, filename } = JSON.parse(content as string)
return {
id,
component: <MapDataUpdater id={id} data={data} filename={filename} />
}
}
case 'geojson_upload': {
try {
const { data, filename } = JSON.parse(content as string)
return {
id,
component: <MapDataUpdater id={id} data={data} filename={filename} />
}
} catch (e) {
console.error('Error parsing geojson_upload content:', e)
return { id, component: null }
}
}
🤖 Prompt for AI Agents
In `@app/actions.tsx` around lines 780 - 786, The JSON.parse in the
'geojson_upload' case can throw for malformed content; wrap the parse of content
in a try-catch (the case handling that returns component: <MapDataUpdater ...
/>) similar to the other role branches: on success return the existing object
with id and component: <MapDataUpdater id={id} data={data} filename={filename}
/>, on parse failure catch the error, log or report it, and return a safe
fallback entry (e.g., an object with id and a lightweight error/display
component or null component) so UI state reconstruction doesn't crash; reference
the 'geojson_upload' case, JSON.parse(content as string), and MapDataUpdater
when applying the fix.

}
break
case 'tool':
Expand Down Expand Up @@ -775,6 +856,26 @@ export const getUIStateFromAIState = (aiState: AIState): UIState => {
}
}
break
case 'data':
try {
const contextData = JSON.parse(content as string)
if (contextData.uploadedGeoJson && Array.isArray(contextData.uploadedGeoJson)) {
return {
id,
component: (
<>
{contextData.uploadedGeoJson.map((item: any) => (
<MapDataUpdater key={item.id} id={item.id} data={item.data} filename={item.filename} />
))}
</>
)
}
}
return { id, component: null }
} catch (e) {
console.error('Error parsing data message:', e)
return { id, component: null }
}
Comment on lines +859 to +878

data message rendering uses any and replays side effects without guarding

In the data case you iterate uploadedGeoJson.map((item: any) => <MapDataUpdater ... />). Two concerns:

  1. Using any here makes it easy to introduce unsafe shapes that still compile. This is UI-state hydration code; it should be strict.
  2. Rendering many MapDataUpdaters will attempt fitBounds for each layer; on chat reload this can cause multiple sequential camera jumps. The current MapDataUpdater only de-dupes by id for inserting into context, but it does not de-dupe the fitBounds side effect. So reload may result in the map fitting to the last item every time, even if the user previously set a different view.
Suggestion
  1. Replace any with a minimal structural type (e.g., { id: string; data: unknown; filename: string }) and validate required fields before rendering.

  2. Add a flag to MapDataUpdater like shouldFlyTo?: boolean and set it to false when replaying persisted uploadedGeoJson from a data message, or add internal logic to only fitBounds on first-ever insert (e.g., only when it was newly added to context).

Reply with "@CharlieHelps yes please" if you'd like me to add a commit implementing shouldFlyTo and tightening the hydration type.

default:
return {
id,
9 changes: 3 additions & 6 deletions app/search/[id]/page.tsx
@@ -48,14 +48,11 @@ export default async function SearchPage({ params }: SearchPageProps) {
const initialMessages: AIMessage[] = dbMessages.map((dbMsg): AIMessage => {
return {
id: dbMsg.id,
role: dbMsg.role as AIMessage['role'], // Cast role, ensure AIMessage['role'] includes all dbMsg.role possibilities
role: dbMsg.role as AIMessage['role'],
content: dbMsg.content,
createdAt: dbMsg.createdAt ? new Date(dbMsg.createdAt) : undefined,
// 'type' and 'name' are not in the basic Drizzle 'messages' schema.
// These would be undefined unless specific logic is added to derive them.
// For instance, if a message with role 'tool' should have a 'name',
// or if some messages have a specific 'type' based on content or other flags.
// This mapping assumes standard user/assistant messages primarily.
type: dbMsg.type as AIMessage['type'],
name: dbMsg.toolName as string,
Comment on lines +51 to +55 (Contributor)

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Scripts executed (repository: QueueLab/QCX):

cd app/search && find . -type f -name "page.tsx" | head -5
cd app && find . -type f -name "*.tsx" -o -name "*.ts" | grep -E "(page|types|message)" | head -20
cat -n app/search/\[id\]/page.tsx | sed -n '40,65p'
# Find AIMessage type definition
rg -n "type AIMessage|interface AIMessage" --type=ts --type=tsx
# Check dbMsg source and context
rg -n "dbMsg" app/search/\[id\]/page.tsx -B 5 -A 2
rg -n "type AIMessage|interface AIMessage" -A 10
rg -n "type DrizzleMessage|interface DrizzleMessage" -A 10
# Search for where these types are defined
fd -e ts -e tsx | xargs rg "export (type|interface) AIMessage"
fd -e ts -e tsx | xargs rg "export (type|interface) DrizzleMessage"
# Check for downstream usages of message.name and message.type
rg "\.name\s*\?" -t=ts -t=tsx | head -20
# Look for optional chaining or null checks on message fields
rg "message\.(name|type|toolName)" | head -30
# Find DrizzleMessage or Message type from Drizzle schema
rg -n "export type Message|export type DrizzleMessage" -A 15
# Check the database schema or Drizzle definitions
fd -e ts -e tsx | xargs rg "messages.*table\|export const messages"
# Look at the Drizzle schema file
find . -name "*schema*" -o -name "*drizzle*" | head -10
# Check db directory structure
ls -la db/ 2>/dev/null || ls -la lib/db/ 2>/dev/null || find . -type d -name "db" | head -3
cat -n lib/db/schema.ts
# Check the actual schema definition for the messages table
cat -n lib/schema/messages.ts 2>/dev/null || cat -n lib/db/schema.ts | grep -A 30 "messages"
# Check the AIMessage type definition more completely
cat -n lib/types/index.ts | sed -n '59,75p'
# Check if there are any null checks on message.type or message.name in the codebase
rg "message\.(type|name)\s*\?" -A 2 -B 2
# Check how the initialMessages are used downstream
rg "initialMessages" app/search/\[id\]/page.tsx -A 5 -B 5
# Check how the type and name fields are accessed in downstream code
rg "\.type\s*==|\.type\s*!=|\.type\s*\?|\.name\s*==|\.name\s*!=" app/ -A 2 -B 2
# Look at the AI component to see how it uses these fields
rg -n "initialAIState" app/ -A 10 | head -50
# Check if there are assertions or strict equality checks on type/name
rg "(type|name)\s*!==" app/ -B 2 -A 2

Convert nullable database fields to undefined when mapping to AIMessage.

Lines 54–55 cast dbMsg.type and dbMsg.toolName without converting null values to undefined. Since the database schema defines these columns as nullable (no .notNull() constraint) and the AIMessage type expects undefined for optional fields, assign null-coalesced values to maintain type consistency.

Recommended fix
-      type: dbMsg.type as AIMessage['type'],
-      name: dbMsg.toolName as string,
+      type: dbMsg.type ?? undefined,
+      name: dbMsg.toolName ?? undefined,
📝 Committable suggestion


Suggested change
role: dbMsg.role as AIMessage['role'],
content: dbMsg.content,
createdAt: dbMsg.createdAt ? new Date(dbMsg.createdAt) : undefined,
// 'type' and 'name' are not in the basic Drizzle 'messages' schema.
// These would be undefined unless specific logic is added to derive them.
// For instance, if a message with role 'tool' should have a 'name',
// or if some messages have a specific 'type' based on content or other flags.
// This mapping assumes standard user/assistant messages primarily.
type: dbMsg.type as AIMessage['type'],
name: dbMsg.toolName as string,
role: dbMsg.role as AIMessage['role'],
content: dbMsg.content,
createdAt: dbMsg.createdAt ? new Date(dbMsg.createdAt) : undefined,
type: dbMsg.type ?? undefined,
name: dbMsg.toolName ?? undefined,
🤖 Prompt for AI Agents
In `@app/search/[id]/page.tsx` around lines 51 - 55, The mapping of DB records to
AIMessage in page.tsx currently casts nullable columns directly (dbMsg.type,
dbMsg.toolName) which can leave nulls where AIMessage expects undefined; update
the mapping in the function that constructs the AIMessage object (the block
assigning role/content/createdAt/type/name) to convert nullable values to
undefined (e.g., use null-coalescing like dbMsg.type ?? undefined and
dbMsg.toolName ?? undefined or conditional expressions) so type and name are
undefined when the DB returns null.

};
});

4 changes: 2 additions & 2 deletions bun.lock

Some generated files are not rendered by default.

5 changes: 3 additions & 2 deletions components/chat-panel.tsx
@@ -153,7 +153,7 @@ export const ChatPanel = forwardRef<ChatPanelRef, ChatPanelProps>(({ messages, i
}
}, 500) // 500ms debounce delay
},
[mapData]
[mapData, setSuggestions]
)

useEffect(() => {
@@ -213,7 +213,8 @@ export const ChatPanel = forwardRef<ChatPanelRef, ChatPanelProps>(({ messages, i
ref={fileInputRef}
onChange={handleFileChange}
className="hidden"
accept="text/plain,image/png,image/jpeg,image/webp"
accept="text/plain,image/png,image/jpeg,image/webp,.geojson,application/geo+json"
data-testid="file-upload-input"
/>
{!isMobile && (
<Button
36 changes: 25 additions & 11 deletions components/chat.tsx
@@ -73,14 +73,18 @@ export function Chat({ id }: ChatProps) {
}, [id, path, messages])

useEffect(() => {
if (aiState.messages[aiState.messages.length - 1]?.type === 'response') {
const lastMessage = aiState.messages[aiState.messages.length - 1];
if (lastMessage?.type === 'response' && lastMessage.id !== lastRefreshedMessageIdRef.current) {
// Refresh the page to chat history updates
lastRefreshedMessageIdRef.current = lastMessage.id;
router.refresh()
}
}, [aiState, router])

// Get mapData to access drawnFeatures
const { mapData } = useMapData();
const lastSyncedDataRef = useRef<string>('');
const lastRefreshedMessageIdRef = useRef<string | null>(null);

useEffect(() => {
if (isSubmitting) {
@@ -89,21 +93,31 @@
}
}, [isSubmitting])

// useEffect to call the server action when drawnFeatures changes
// useEffect to call the server action when drawnFeatures or uploadedGeoJson changes
useEffect(() => {
if (id && mapData.drawnFeatures && mapData.cameraState) {
console.log('Chat.tsx: drawnFeatures changed, calling updateDrawingContext', mapData.drawnFeatures);
updateDrawingContext(id, {
drawnFeatures: mapData.drawnFeatures,
if (id && (mapData.drawnFeatures || mapData.uploadedGeoJson) && mapData.cameraState) {
const currentData = JSON.stringify({
drawnFeatures: mapData.drawnFeatures || [],
cameraState: mapData.cameraState,
uploadedGeoJson: (mapData.uploadedGeoJson || []).map(item => ({ id: item.id, visible: item.visible }))
});

if (currentData !== lastSyncedDataRef.current) {
console.log('Chat.tsx: map data changed, calling updateDrawingContext');
lastSyncedDataRef.current = currentData;
updateDrawingContext(id, {
drawnFeatures: mapData.drawnFeatures || [],
cameraState: mapData.cameraState,
uploadedGeoJson: mapData.uploadedGeoJson || []
});
}
}
}, [id, mapData.drawnFeatures, mapData.cameraState]);
}, [id, mapData.drawnFeatures, mapData.cameraState, mapData.uploadedGeoJson]);

// Mobile layout
if (isMobile) {
return (
<MapDataProvider> {/* Add Provider */}
<>
<HeaderSearchButton />
<div className="mobile-layout-container">
<div className="mobile-map-section">
@@ -157,13 +171,13 @@
)}
</div>
</div>
</MapDataProvider>
</>
);
}

// Desktop layout
return (
<MapDataProvider> {/* Add Provider */}
<>
<HeaderSearchButton />
<div className="flex justify-start items-start">
{/* This is the new div for scrolling */}
@@ -221,6 +235,6 @@
{activeView ? <SettingsView /> : <MapProvider />}
</div>
</div>
</MapDataProvider>
</>
);
}
3 changes: 3 additions & 0 deletions components/map/google-map.tsx
@@ -81,6 +81,9 @@ export function GoogleMapComponent() {
mode="SATELLITE"
/>
<GoogleGeoJsonLayer data={featureCollection} />
{mapData.uploadedGeoJson?.map(item => (
item.visible && <GoogleGeoJsonLayer key={item.id} data={item.data} />
))}
Comment on lines +84 to +86 (Contributor)

🧹 Nitpick | 🔵 Trivial

Conditional rendering inside .map() can return false to the array.

When item.visible is falsy, the expression item.visible && <GoogleGeoJsonLayer ... /> returns false, which React will include in the rendered output (though it won't display anything). For cleaner behavior, filter before mapping or use a ternary with null:

♻️ Suggested refactor
-      {mapData.uploadedGeoJson?.map(item => (
-        item.visible && <GoogleGeoJsonLayer key={item.id} data={item.data} />
-      ))}
+      {mapData.uploadedGeoJson
+        ?.filter(item => item.visible)
+        .map(item => (
+          <GoogleGeoJsonLayer key={item.id} data={item.data} />
+        ))}
📝 Committable suggestion


Suggested change
{mapData.uploadedGeoJson?.map(item => (
item.visible && <GoogleGeoJsonLayer key={item.id} data={item.data} />
))}
{mapData.uploadedGeoJson
?.filter(item => item.visible)
.map(item => (
<GoogleGeoJsonLayer key={item.id} data={item.data} />
))}
🤖 Prompt for AI Agents
In `@components/map/google-map.tsx` around lines 84 - 86, The map callback
currently uses a logical && which can push false values into the JSX array;
update the rendering of mapData.uploadedGeoJson so you either filter out
invisible items first (e.g., uploadedGeoJson.filter(item =>
item.visible).map(item => <GoogleGeoJsonLayer key={item.id} data={item.data}
/>)) or use a ternary to return null for invisible items (item.visible ?
<GoogleGeoJsonLayer key={item.id} data={item.data} /> : null); apply this change
where mapData.uploadedGeoJson is iterated to ensure only valid React nodes (no
false values) are returned.

</APIProvider>
)
}
12 changes: 11 additions & 1 deletion components/map/map-data-context.tsx
@@ -29,6 +29,12 @@ export interface MapData {
longitude: number;
title?: string;
}>;
uploadedGeoJson?: Array<{
id: string;
filename: string;
data: any; // FeatureCollection
visible: boolean;
}>;
Comment on lines +32 to +37 (Contributor)

🧹 Nitpick | 🔵 Trivial

Consider typing data as FeatureCollection for type safety.

The data: any type loses type safety. Based on the GeoJSON layer components (e.g., GeoJsonLayerProps in geojson-layer.tsx), this should be a GeoJSON FeatureCollection.

♻️ Suggested type improvement
+import type { FeatureCollection } from 'geojson';
+
 // ... in MapData interface
   uploadedGeoJson?: Array<{
     id: string;
     filename: string;
-    data: any; // FeatureCollection
+    data: FeatureCollection;
     visible: boolean;
   }>;
📝 Committable suggestion


Suggested change
uploadedGeoJson?: Array<{
id: string;
filename: string;
data: any; // FeatureCollection
visible: boolean;
}>;
uploadedGeoJson?: Array<{
id: string;
filename: string;
data: FeatureCollection;
visible: boolean;
}>;
🤖 Prompt for AI Agents
In `@components/map/map-data-context.tsx` around lines 32 - 37, The
uploadedGeoJson entry uses data: any which loses type safety; update the type in
map-data-context.tsx to use GeoJSON's FeatureCollection (import
FeatureCollection from 'geojson' or the project's GeoJSON types) so
uploadedGeoJson?: Array<{ id: string; filename: string; data: FeatureCollection;
visible: boolean; }>; reference the existing GeoJsonLayerProps in
geojson-layer.tsx to ensure the FeatureCollection generic matches the layer's
expected geometry/props types and adjust imports accordingly.

Comment on lines +32 to +37

uploadedGeoJson is typed as data: any in context and MapDataUpdaterProps, and geoJson?: any in tool output. Since this pipeline crosses persistence boundaries and executes viewport operations, this is a good place to tighten runtime validation and narrow types.

Even a lightweight validation (checking type and features array) would prevent downstream runtime errors and reduce the surface area for malicious/accidental payloads.

Suggestion

Replace any with Feature | FeatureCollection (from geojson) and add a small runtime validator at ingestion points (upload + tool) that rejects anything else.

Example types:

import type { Feature, FeatureCollection } from 'geojson'

data: FeatureCollection | Feature

Reply with "@CharlieHelps yes please" if you’d like me to add a commit that tightens types and adds a shared isGeoJsonFeatureOrCollection() helper.

}

interface MapDataContextType {
@@ -39,7 +45,11 @@
const MapDataContext = createContext<MapDataContextType | undefined>(undefined);

export const MapDataProvider: React.FC<{ children: ReactNode }> = ({ children }) => {
const [mapData, setMapData] = useState<MapData>({ drawnFeatures: [], markers: [] });
const [mapData, setMapData] = useState<MapData>({
drawnFeatures: [],
markers: [],
uploadedGeoJson: []
});

return (
<MapDataContext.Provider value={{ mapData, setMapData }}>
65 changes: 65 additions & 0 deletions components/map/map-data-updater.tsx
@@ -0,0 +1,65 @@
'use client';

import { useEffect, useRef } from 'react';
import { useMapData } from './map-data-context';
import { useMap } from './map-context';
import type { FeatureCollection } from 'geojson';
import * as turf from '@turf/turf';

interface MapDataUpdaterProps {
id: string;
data: any; // FeatureCollection or Feature
filename: string;
}
Comment on lines +9 to +13 (Contributor)

🧹 Nitpick | 🔵 Trivial

Consider stronger typing for the data prop.

Using any here loses type safety. Consider using a union type to be explicit about expected inputs:

+import type { Feature } from 'geojson';
+
 interface MapDataUpdaterProps {
   id: string;
-  data: any; // FeatureCollection or Feature
+  data: FeatureCollection | Feature | null;
   filename: string;
 }
🤖 Prompt for AI Agents
In `@components/map/map-data-updater.tsx` around lines 9 - 13, The
MapDataUpdaterProps interface uses a loose any for the data prop; change it to
an explicit GeoJSON union (e.g., GeoJSON.FeatureCollection | GeoJSON.Feature) by
importing the GeoJSON types and replacing the data: any declaration in
MapDataUpdaterProps with the appropriate union type; also update any usages of
the data prop (in the MapDataUpdater component and callers) to satisfy the
stronger type (narrow with type guards where you expect Feature vs
FeatureCollection).


export function MapDataUpdater({ id, data, filename }: MapDataUpdaterProps) {
const { setMapData } = useMapData();
const { map } = useMap();
const hasZoomedRef = useRef(false);

useEffect(() => {
if (!data) return;

// Ensure it's a FeatureCollection for consistency
const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
? data
: { type: 'FeatureCollection', features: [data] };
Comment on lines +23 to +26 (Contributor)

⚠️ Potential issue | 🟡 Minor

Potential silent failure with invalid GeoJSON data.

The normalization logic assumes data is either a FeatureCollection or a valid Feature. If data has an unexpected structure (e.g., data.type is "Point" or "Polygon" — raw geometries rather than Features), wrapping it as { type: 'FeatureCollection', features: [data] } will produce an invalid FeatureCollection that may cause downstream rendering issues.

Consider adding validation:

🛡️ Proposed validation
+    // Validate that data is a Feature or FeatureCollection
+    if (data.type !== 'FeatureCollection' && data.type !== 'Feature') {
+      console.warn('MapDataUpdater: Invalid GeoJSON type, expected Feature or FeatureCollection:', data.type);
+      return;
+    }
+
     // Ensure it's a FeatureCollection for consistency
     const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
       ? data
       : { type: 'FeatureCollection', features: [data] };
📝 Committable suggestion


Suggested change
// Ensure it's a FeatureCollection for consistency
const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
? data
: { type: 'FeatureCollection', features: [data] };
// Validate that data is a Feature or FeatureCollection
if (data.type !== 'FeatureCollection' && data.type !== 'Feature') {
console.warn('MapDataUpdater: Invalid GeoJSON type, expected Feature or FeatureCollection:', data.type);
return;
}
// Ensure it's a FeatureCollection for consistency
const featureCollection: FeatureCollection = data.type === 'FeatureCollection'
? data
: { type: 'FeatureCollection', features: [data] };
🤖 Prompt for AI Agents
In `@components/map/map-data-updater.tsx` around lines 22 - 25, The normalization
blindly treats non-FeatureCollection inputs as Features and can produce invalid
GeoJSON; update the logic around the featureCollection variable to validate the
incoming data.type: if data.type === 'FeatureCollection' keep it, if data.type
=== 'Feature' use it directly, if data.type is a GeoJSON Geometry type (e.g.,
'Point','Polygon','LineString', etc.) wrap it in a Feature object ({ type:
'Feature', geometry: data, properties: {} }), and otherwise handle invalid input
by logging an error via the same logger and returning or falling back to an
empty FeatureCollection ({ type: 'FeatureCollection', features: [] }) so
downstream consumers only ever receive a valid FeatureCollection of Feature
objects. Ensure this uses the same variable names (featureCollection, data) so
the change is easy to locate.
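A sketch of that normalization as a standalone helper (the toFeatureCollection name and the empty-collection fallback are illustrative choices, not part of the PR):

import type { Feature, FeatureCollection, Geometry } from 'geojson'

const GEOMETRY_TYPES = new Set([
  'Point', 'MultiPoint', 'LineString', 'MultiLineString',
  'Polygon', 'MultiPolygon', 'GeometryCollection'
])

// Always hand downstream consumers a valid FeatureCollection.
function toFeatureCollection(data: any): FeatureCollection {
  if (data?.type === 'FeatureCollection' && Array.isArray(data.features)) {
    return data as FeatureCollection
  }
  if (data?.type === 'Feature') {
    return { type: 'FeatureCollection', features: [data as Feature] }
  }
  if (typeof data?.type === 'string' && GEOMETRY_TYPES.has(data.type)) {
    // Wrap a bare geometry in a Feature so the collection stays valid GeoJSON.
    return {
      type: 'FeatureCollection',
      features: [{ type: 'Feature', geometry: data as Geometry, properties: {} }]
    }
  }
  console.error('MapDataUpdater: unsupported GeoJSON input', data?.type)
  return { type: 'FeatureCollection', features: [] }
}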


// Update MapData context
setMapData(prev => {
// Avoid duplicate entries
const alreadyExists = prev.uploadedGeoJson?.some(item => item.id === id);
if (alreadyExists) return prev;

return {
...prev,
uploadedGeoJson: [
...(prev.uploadedGeoJson || []),
{
id,
filename,
data: featureCollection,
visible: true
}
]
};
});

// Fly to the extent of the GeoJSON
if (map && featureCollection.features.length > 0 && !hasZoomedRef.current) {
hasZoomedRef.current = true;
try {
const bbox = turf.bbox(featureCollection);
map.fitBounds(bbox as [number, number, number, number], {
padding: 50,
maxZoom: 15,
duration: 2000
});
} catch (e) {
console.error('Failed to fit bounds for GeoJSON:', e);
}
}
}, [id, data, filename, setMapData, map]);

return null; // Headless component
}