Conversation
# Conflicts: requirements/project-requirements.txt
…l-phishing-analysis-framework
…aybook-phishing-analysis
…hing_form_compiler.py (Co-authored-by: Matteo Lodi <[email protected]>)
api_app/analyzers_manager/file_analyzers/phishing/phishing_form_compiler.py
Fixed
The flagged code logs the submitted form value verbatim:

```python
logger.info(
    f"Job #{self.job_id}: Sending value {value_to_set} for {input_name=}"
)
```
Check failure
Code scanning / CodeQL
Clear-text logging of sensitive information
Copilot Autofix (AI, over 1 year ago)
To fix the problem, we need to ensure that sensitive information such as passwords is not logged in clear text. Instead of logging the actual value of sensitive fields, we can log a placeholder or a generic message indicating that a sensitive value was processed; this preserves the logging functionality without exposing sensitive data.
The fix that avoids changing existing functionality is to modify the logging statements so they exclude sensitive data, replacing the actual values of sensitive fields with a placeholder such as "[REDACTED]".
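The same idea generalizes beyond a single password field. A minimal sketch of a reusable redaction helper, assuming a hypothetical key list (the helper and its names are illustrative, not part of the PR):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("phishing_form_compiler")

# Assumed set of sensitive field names; extend to match the forms handled.
SENSITIVE_KEYS = {"password", "passwd", "pwd", "token", "secret"}

def redact(params: dict) -> dict:
    """Return a copy of params with values of sensitive keys masked."""
    return {
        k: ("[REDACTED]" if k.lower() in SENSITIVE_KEYS else v)
        for k, v in params.items()
    }

form_data = {"username": "victim", "Password": "hunter2"}
logger.info("Sending %s", redact(form_data))
```

Matching on the lowercased key means variants like `Password` or `PASSWD` are also caught, which the single `== "password"` comparison in the autofix would miss.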
```diff
@@ -140,3 +140,3 @@
     logger.info(
-        f"Job #{self.job_id}: Found hidden input tag with {input_name=} and {input_value=}"
+        f"Job #{self.job_id}: Found hidden input tag with {input_name=} and value [REDACTED]"
     )
@@ -157,5 +157,10 @@
 
-    logger.info(
-        f"Job #{self.job_id}: Sending value {value_to_set} for {input_name=}"
-    )
+    if input_type.lower() == "password":
+        logger.info(
+            f"Job #{self.job_id}: Sending value [REDACTED] for {input_name=}"
+        )
+    else:
+        logger.info(
+            f"Job #{self.job_id}: Sending value {value_to_set} for {input_name=}"
+        )
     result.setdefault(input_name, value_to_set)
@@ -165,3 +170,4 @@
     params, dest_url = self.compile_form_field(form)
-    logger.info(f"Job #{self.job_id}: Sending {params=} to submit url {dest_url}")
+    redacted_params = {k: (v if k.lower() != "password" else "[REDACTED]") for k, v in params.items()}
+    logger.info(f"Job #{self.job_id}: Sending {redacted_params=} to submit url {dest_url}")
     return requests.post(
```
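The comprehension in the last hunk can be exercised on its own; the sample credentials below are made up:

```python
params = {"username": "victim@example.com", "password": "hunter2"}

# Mask only the password value before it reaches the log line.
redacted_params = {
    k: (v if k.lower() != "password" else "[REDACTED]")
    for k, v in params.items()
}

print(redacted_params)
# → {'username': 'victim@example.com', 'password': '[REDACTED]'}
```

Note that the comprehension builds a separate dict purely for logging, so the original `params` still carries the real value for the `requests.post` call.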
Description
Added a framework for dumping information about web pages and interacting with phishing pages.
Type of change
Checklist
- … develop
- … `dumpplugin` command and added it in the project as a data migration. ("How to share a plugin with the community")
- … `test_files.zip` and you added the default tests for that mimetype in `test_classes.py`.
- … `FREE_TO_USE_ANALYZERS` playbook by following this guide.
- … `url` that contains this information. This is required for Health Checks.
- … `_monkeypatch()` was used in its class to apply the necessary decorators.
- … `MockUpResponse` of the `_monkeypatch()` method. This serves us to provide a valid sample for testing.
- … (`Black`, `Flake`, `Isort`) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
- … `tests` folder). All the tests (new and old ones) gave 0 errors.
- If `DeepSource`, `Django Doctors` or other third-party linters have triggered any alerts during the CI checks, I have solved those alerts.

Important Rules
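The `_monkeypatch()` / `MockUpResponse` checklist items refer to mocking HTTP responses so analyzer tests run offline. A hypothetical sketch of that pattern, using only the standard library (the function and attribute names here are illustrative, not IntelOwl's actual API):

```python
import types
from unittest.mock import patch

class MockUpResponse:
    """Stand-in for an HTTP response carrying a canned JSON payload."""

    def __init__(self, json_data, status_code):
        self._json = json_data
        self.status_code = status_code

    def json(self):
        return self._json

# Placeholder namespace standing in for the real HTTP client (e.g. requests).
net = types.SimpleNamespace(post=None)

def submit_form(url, data):
    # Hypothetical analyzer step that would normally hit the network.
    return net.post(url, data=data).json()

# Patch the network call so the test never performs a real request.
with patch.object(net, "post", return_value=MockUpResponse({"verdict": "phishing"}, 200)):
    result = submit_form("https://example.com/login", {"user": "x"})
```

Patching the call site this way keeps the test deterministic and lets the canned payload serve as a valid sample, which is what the checklist asks `MockUpResponse` to provide.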