reply: normalize Windows file paths for media dedupe #37425
shuofengzhang wants to merge 1 commit into openclaw:main
Conversation
Greptile Summary: This PR adds Windows file path normalization to the media deduplication logic in filterMessagingToolMediaDuplicates.
Confidence Score: 5/5
Last reviewed commit: 98731af
Force-pushed 98731af to cbce322
Force-pushed 2356f66 to 5256d41
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 5256d41c60
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```js
if (!path) {
  return "";
}
let normalized = path.replace(/\\/g, "/");
```
Restrict backslash normalization to Windows-style paths
Replacing every `\` with `/` changes semantics for valid POSIX file names that contain a literal backslash, so distinct files can be treated as duplicates and dropped. For example, on Linux/macOS a previously sent URL like `file:///tmp/a%5Cb.jpg` normalizes to `/tmp/a/b.jpg`, which then matches a different payload path `/tmp/a/b.jpg` and suppresses a non-duplicate media send. This regression is introduced by applying filesystem normalization unconditionally instead of only for Windows-style paths/URLs.
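A minimal sketch of the guarded normalization the reviewer is suggesting. The function name `normalizeMediaPath` and the Windows-detection regex are hypothetical illustrations, not the PR's actual code: the idea is to rewrite separators only when the path looks Windows-style (drive letter or UNC prefix), so POSIX paths containing a literal backslash pass through untouched.

```typescript
// Heuristic: a path is "Windows-style" if it starts with a drive letter
// (C:\ or C:/) or a UNC prefix (\\server\share). This regex is an
// assumption for illustration, not the repository's implementation.
const WINDOWS_PATH_RE = /^(?:[a-zA-Z]:[\\/]|\\\\)/;

function normalizeMediaPath(path: string): string {
  if (!path) {
    return "";
  }
  if (WINDOWS_PATH_RE.test(path)) {
    // Rewrite backslashes to forward slashes only for Windows-style paths.
    let normalized = path.replace(/\\/g, "/");
    // Lowercase the drive letter so C:/ and c:/ compare equal.
    normalized = normalized.replace(
      /^([a-zA-Z]):/,
      (_m: string, d: string) => `${d.toLowerCase()}:`,
    );
    return normalized;
  }
  // POSIX path: a literal backslash is a valid filename character, keep it.
  return path;
}

console.log(normalizeMediaPath("C:\\Users\\me\\pic.jpg")); // "c:/Users/me/pic.jpg"
console.log(normalizeMediaPath("/tmp/a\\b.jpg")); // unchanged: "/tmp/a\b.jpg"
```

With this guard, the `file:///tmp/a%5Cb.jpg` example above would no longer collide with `/tmp/a/b.jpg` on Linux/macOS.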
Closing this PR because the author has more than 10 active PRs in this repo. Please reduce the active PR queue and reopen or resubmit once it is back under the limit. You can close your own PRs to get back under the limit.
What changed
- Normalizes file paths in `filterMessagingToolMediaDuplicates` so equivalent Windows forms dedupe correctly:
  - backslash (`\`) vs slash (`/`) separators
  - `file:///C:/...` vs local `C:\...` / `C:/...` forms
  - drive-letter case (`C:` vs `c:`)
  - `file://` URL equivalence

Why
- On Windows, the same media file could be referenced as a local path or as a `file://` URL, causing duplicate media payloads.

Testing
- `scripts/clone_and_test.sh openclaw/openclaw` (pass: 16 passed)
- `npx vitest --run src/auto-reply/reply/reply-payloads.test.ts` (pass: 17 passed)
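The equivalence classes described in this PR (separators, `file://` URLs, drive-letter case) can be sketched as a single dedupe-key function. This is a hypothetical illustration, not the PR's implementation; the name `mediaDedupeKey` and the exact rules are assumptions based on the summary above.

```typescript
// Collapse the equivalent Windows forms of a media reference into one
// canonical key, so dedupe treats them as the same file. Hypothetical
// sketch: names and rules are assumptions, not the PR's actual code.
function mediaDedupeKey(ref: string): string {
  let path = ref;
  if (ref.startsWith("file://")) {
    // file:///C:/dir/pic.jpg -> "/C:/dir/pic.jpg" (percent-decoded)
    path = decodeURIComponent(new URL(ref).pathname);
    // Drop the leading slash that precedes a Windows drive letter.
    path = path.replace(/^\/([a-zA-Z]:)/, "$1");
  }
  // Backslash vs slash separators.
  path = path.replace(/\\/g, "/");
  // Drive-letter case (C: vs c:).
  return path.replace(
    /^([a-zA-Z]):/,
    (_m: string, d: string) => `${d.toLowerCase()}:`,
  );
}

// All three forms collapse to the same key, "c:/Media/pic.jpg":
console.log(mediaDedupeKey("file:///C:/Media/pic.jpg"));
console.log(mediaDedupeKey("C:\\Media\\pic.jpg"));
console.log(mediaDedupeKey("c:/Media/pic.jpg"));
```

Note that, per the review comment above, a production version would apply the backslash rewrite only to Windows-style inputs to avoid breaking POSIX filenames containing literal backslashes.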