11 changes: 8 additions & 3 deletions src/client.ts
@@ -344,9 +344,14 @@ export class Pushy {
const remoteEndpoints = await resp.json();
log('fetch endpoints:', remoteEndpoints);
if (Array.isArray(remoteEndpoints)) {
server.backups = Array.from(
new Set([...(server.backups || []), ...remoteEndpoints]),
);
const backups = server.backups || [];
const set = new Set(backups);
for (const endpoint of remoteEndpoints) {
set.add(endpoint);
}
if (set.size !== backups.length) {
server.backups = Array.from(set);
}
Comment on lines +347 to +354
⚠️ Potential issue | 🟠 Major

Fix merge-change detection when backups already contains duplicates.

Line 352 compares against backups.length, which can miss real additions.
Example: ['a','a','b'] + ['c'] yields set.size === 3 and skips assignment, so 'c' is lost.
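The failure mode above can be reproduced in a few lines. This is a hypothetical standalone sketch, not the Pushy class itself; the variable names mirror the diff:

```typescript
// When `backups` already contains duplicates, comparing the merged set's
// size against `backups.length` can mask a real addition.
const backups = ['a', 'a', 'b']; // length 3, but only 2 unique entries
const remoteEndpoints = ['c'];

const set = new Set(backups); // { 'a', 'b' } — size 2
for (const endpoint of remoteEndpoints) {
  set.add(endpoint); // { 'a', 'b', 'c' } — size 3
}

// Buggy check: set.size (3) === backups.length (3), so the merge is skipped
// and 'c' never reaches server.backups.
const buggySkips = set.size === backups.length;

// Fixed check: compare against the deduplicated original size instead.
const initialUniqueSize = new Set(backups).size; // 2
const fixedDetects = set.size !== initialUniqueSize; // true — 'c' was added
```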

Proposed fix
-          const backups = server.backups || [];
-          const set = new Set(backups);
+          const backups = server.backups || [];
+          const set = new Set(backups);
+          const initialUniqueSize = set.size;
           for (const endpoint of remoteEndpoints) {
             set.add(endpoint);
           }
-          if (set.size !== backups.length) {
+          const hadDuplicates = backups.length !== initialUniqueSize;
+          const addedNewEndpoints = set.size !== initialUniqueSize;
+          if (hadDuplicates || addedNewEndpoints) {
             server.backups = Array.from(set);
           }
πŸ“ Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
const backups = server.backups || [];
const set = new Set(backups);
for (const endpoint of remoteEndpoints) {
set.add(endpoint);
}
if (set.size !== backups.length) {
server.backups = Array.from(set);
}
const backups = server.backups || [];
const set = new Set(backups);
const initialUniqueSize = set.size;
for (const endpoint of remoteEndpoints) {
set.add(endpoint);
}
const hadDuplicates = backups.length !== initialUniqueSize;
const addedNewEndpoints = set.size !== initialUniqueSize;
if (hadDuplicates || addedNewEndpoints) {
server.backups = Array.from(set);
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client.ts` around lines 347-354, the current merge logic uses
backups.length to detect changes, but that fails when backups already contains
duplicates. Compute the deduplicated original size first (e.g., const
originalSize = new Set(backups).size), then add remoteEndpoints to the set and
compare set.size !== originalSize to decide whether to assign server.backups =
Array.from(set); reference the variables backups, remoteEndpoints,
server.backups and the Set used to dedupe.
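The approach the prompt describes can be sketched as a standalone helper. Note that `mergeBackups` is a hypothetical function for illustration, not part of the actual Pushy client, and unlike the committable suggestion above it reassigns only when a genuinely new endpoint was added (it does not normalize pre-existing duplicates on their own):

```typescript
// Hypothetical helper: merge remote endpoints into the backup list,
// detecting changes against the deduplicated original size rather
// than the raw array length.
function mergeBackups(
  backups: string[] | undefined,
  remoteEndpoints: string[],
): string[] | undefined {
  const existing = backups || [];
  const set = new Set(existing);
  const originalSize = set.size; // deduplicated size, not existing.length
  for (const endpoint of remoteEndpoints) {
    set.add(endpoint);
  }
  // Reassign only when a genuinely new endpoint was added.
  return set.size !== originalSize ? Array.from(set) : backups;
}
```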

}
} catch (e: any) {
log('failed to fetch endpoints from: ', server.queryUrls);