
Bump js-yaml from 4.1.0 to 4.1.1 in /sql2csv.web #2

Merged
markhazleton merged 1 commit into main from
dependabot/npm_and_yarn/sql2csv.web/js-yaml-4.1.1
Jan 12, 2026

Conversation

@dependabot dependabot Bot commented on behalf of GitHub on Nov 15, 2025

Bumps js-yaml from 4.1.0 to 4.1.1.

Changelog

Sourced from js-yaml's changelog.

[4.1.1] - 2025-11-12

Security

  • Fix prototype pollution issue in yaml merge (<<) operator.
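The fixed vulnerability class is worth a closer look. The sketch below is not js-yaml's actual implementation; it is a generic illustration of how a naive recursive merge over attacker-controlled keys (such as those supplied through a YAML `<<` merge) can reach `Object.prototype` via a `__proto__` key and pollute every plain object, and how a hardened merge avoids it by refusing the dangerous key names.

```javascript
// Naive recursive merge: recursing into target["__proto__"] walks up to
// Object.prototype, so nested attacker-controlled keys land on it.
function naiveMerge(target, source) {
  for (const key of Object.keys(source)) {
    const value = source[key];
    if (value !== null && typeof value === 'object') {
      if (target[key] === null || typeof target[key] !== 'object') {
        target[key] = {};
      }
      naiveMerge(target[key], value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// Hardened variant: skip the keys that can reach a prototype chain.
function safeMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (key === '__proto__' || key === 'constructor' || key === 'prototype') {
      continue;
    }
    const value = source[key];
    if (value !== null && typeof value === 'object') {
      if (target[key] === null || typeof target[key] !== 'object') {
        target[key] = {};
      }
      safeMerge(target[key], value);
    } else {
      target[key] = value;
    }
  }
  return target;
}

// JSON.parse, like a YAML parser, creates an OWN "__proto__" property,
// which Object.keys then exposes to the merge above.
const tainted = JSON.parse('{"a": 1, "__proto__": {"polluted": true}}');
naiveMerge({}, tainted);
console.log(({}).polluted); // the prototype has been polluted
delete Object.prototype.polluted;
```

After `naiveMerge`, every freshly created object appears to have a `polluted` property; `safeMerge` leaves `Object.prototype` untouched. The same hardening idea (rejecting or sanitizing `__proto__`-style keys during merge) is the standard remediation for this bug class.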
Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the Security Alerts page.

@dependabot dependabot Bot added the dependencies (Pull requests that update a dependency file) and javascript (Pull requests that update javascript code) labels on Nov 15, 2025
@markhazleton
Owner

@dependabot rebase

@dependabot dependabot Bot force-pushed the dependabot/npm_and_yarn/sql2csv.web/js-yaml-4.1.1 branch from 8d9cde1 to 60ea0b7 on January 12, 2026 12:33
@markhazleton
Owner

@dependabot rebase

@dependabot dependabot Bot force-pushed the dependabot/npm_and_yarn/sql2csv.web/js-yaml-4.1.1 branch from 60ea0b7 to 5afeb55 on January 12, 2026 12:38
@markhazleton
Owner

@dependabot rebase

Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 4.1.0 to 4.1.1.
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](nodeca/js-yaml@4.1.0...4.1.1)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 4.1.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot Bot force-pushed the dependabot/npm_and_yarn/sql2csv.web/js-yaml-4.1.1 branch from 5afeb55 to 89435f1 on January 12, 2026 12:42
@github-actions

$(cat coverage-summary.txt 2>/dev/null || echo "Coverage unavailable")
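The bot comment above is a shell command substitution that was posted literally instead of being expanded, so the coverage number never reached the PR. A hypothetical sketch of the intended behavior (not the repo's actual workflow; the filename and fallback text are taken from the comment itself):

```shell
# Expand the coverage summary into a variable inside the shell step,
# then use that variable as the comment body. Falls back to a fixed
# message when the report file is missing.
printf 'Lines: 95%%\n' > coverage-summary.txt   # stand-in for the real report
BODY="$(cat coverage-summary.txt 2>/dev/null || echo "Coverage unavailable")"
echo "$BODY"
rm -f coverage-summary.txt
```

The usual cause of the literal output is single-quoting (or YAML-escaping) the `$( ... )` expression in the workflow file, which suppresses the substitution before the comment is posted.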

@markhazleton markhazleton merged commit 4f263a0 into main Jan 12, 2026
2 checks passed
@markhazleton markhazleton deleted the dependabot/npm_and_yarn/sql2csv.web/js-yaml-4.1.1 branch January 12, 2026 12:43
markhazleton added a commit that referenced this pull request Mar 31, 2026
…racts

Addresses all 16 findings from /speckit.analyze:

CRITICAL:
- C1: Add T113 (integration tests FR-043) + T114 (CI coverage gate) to tasks.md
- C2: Add T115 (constitution.md rename post Phase 1) to tasks.md

HIGH:
- H1: FR-014 updated to 5 formats (add JPEG); T048 updated to match
- H2: plan.md corrective action #2 corrected from UnifiedAnalysisService ->
      DatabaseAnalysisService line ~729 (both Gate Result and Principle VI note)
- H3: T041/T099 scope split clarified; T099 now depends on T041

MEDIUM:
- M1: FR-003 'first rows' -> 'first 10 rows' (matches edge case spec)
- M2: Add T117 (GitHub archival for DataAnalysisDemo) to tasks.md
- M3: Add T118 (pivot 50K perf test SC-005) to tasks.md
- M4: Add T119 (CLI batch test SC-006) to tasks.md
- M5: plan.md Principle V note corrected: only Sql2Csv.Tests missing TreatWarningsAsErrors
- M6: FR-007 now defines data quality score; T037 updated to verify it
- M7: Add T116 (GitHub repo rename) to tasks.md

LOW:
- L1: T028 tightened to 'controller actions'; T103 tightened to 'view forms'
- L2: T007/T009 ordering dependency noted; T009 excludes RootNamespace/AssemblyName
- L3: T017 now covers both .github/copilot-instructions.md files
- L4: T065 specifies ZIP output; contracts/web-api.md adds GET /api/Database/export-all

Total tasks: 112 -> 119
markhazleton added a commit that referenced this pull request Mar 31, 2026
* feat: Add feature specification for DataSpark platform consolidation

- Introduced a comprehensive feature specification document outlining user stories, acceptance criteria, functional requirements, and success criteria for the consolidation of sql2csv, DataAnalysisDemo, and related repositories into a unified DataSpark platform.
- Document includes detailed user scenarios for data upload, exploratory analysis, chart creation, pivot tables, SQLite database tools, AI insights, statistical analysis, CLI automation, and more.
- Established clear requirements for data ingestion, analysis, visualization, and API integration, ensuring a robust foundation for the DataSpark platform.

docs: Create consolidation recommendation document for DataSpark

- Added a detailed analysis and recommendation document for the consolidation of sql2csv and DataAnalysisDemo repositories.
- Provided an executive summary, repository comparison, feature overlap matrix, and a phased approach for rebranding and feature porting.
- Outlined unique features from DataAnalysisDemo worth porting and recommended a unified architecture for the DataSpark solution.

* docs(spec): add DataSpark consolidation plan, tasks, and design artifacts

- plan.md: technical context, constitution check (PASS), project structure
- research.md: 8 topics resolved (namespace rename, SQL fix, API auth, samples)
- data-model.md: 10 entities with fields, types, constraints, validation rules
- contracts/web-api.md: 10 REST API endpoints with request/response schemas
- contracts/cli.md: 4 CLI commands (discover, export, schema, generate)
- quickstart.md: developer getting-started guide
- tasks.md: 112 tasks across 14 phases, organized by 11 user stories
- Updated copilot agent context with tech stack from plan

Feature: 001-dataspark-consolidation

* docs(spec): apply analysis remediation across spec, plan, tasks, contracts

Addresses all 16 findings from /speckit.analyze:

CRITICAL:
- C1: Add T113 (integration tests FR-043) + T114 (CI coverage gate) to tasks.md
- C2: Add T115 (constitution.md rename post Phase 1) to tasks.md

HIGH:
- H1: FR-014 updated to 5 formats (add JPEG); T048 updated to match
- H2: plan.md corrective action #2 corrected from UnifiedAnalysisService ->
      DatabaseAnalysisService line ~729 (both Gate Result and Principle VI note)
- H3: T041/T099 scope split clarified; T099 now depends on T041

MEDIUM:
- M1: FR-003 'first rows' -> 'first 10 rows' (matches edge case spec)
- M2: Add T117 (GitHub archival for DataAnalysisDemo) to tasks.md
- M3: Add T118 (pivot 50K perf test SC-005) to tasks.md
- M4: Add T119 (CLI batch test SC-006) to tasks.md
- M5: plan.md Principle V note corrected: only Sql2Csv.Tests missing TreatWarningsAsErrors
- M6: FR-007 now defines data quality score; T037 updated to verify it
- M7: Add T116 (GitHub repo rename) to tasks.md

LOW:
- L1: T028 tightened to 'controller actions'; T103 tightened to 'view forms'
- L2: T007/T009 ordering dependency noted; T009 excludes RootNamespace/AssemblyName
- L3: T017 now covers both .github/copilot-instructions.md files
- L4: T065 specifies ZIP output; contracts/web-api.md adds GET /api/Database/export-all

Total tasks: 112 -> 119

* Add unit tests for SchemaService and implement API key authentication middleware

- Created SchemaServiceTests to validate the functionality of SchemaService methods, including GetTablesAsync, GetTableNamesAsync, and GenerateSchemaReportAsync.
- Added tests for handling null and invalid connection strings, cancellation tokens, and ensuring correct row counts and schema report formats.
- Implemented ApiKeyAuthMiddleware to enforce API key authentication on API routes, including error handling for missing or invalid keys.
- Added a solution file to organize the project structure and included a strong name key file for assembly signing.

* feat(dataspark): implement database tools, AI hardening, analytics, and CLI consolidation

* docs(spec): add pull request review for DataSpark consolidation with compliance assessment and recommendations

* fix(pr-9): address review findings for security, architecture, async, and logging

* fix(pr-9): close remaining security and architecture review findings

* docs(pr-review): update PR #9 follow-up review report

* fix(pr-9): address remaining findings and enhance security measures in implementation plan

* fix(pr-9): update review metadata and address async I/O discipline violations in DataSpark.Core services

* fix(core,web,tests): resolve PR #9 review findings

* fix(pr-9): update review metadata and improve assessment details in PR documentation

* feat(core): implement database discovery summary service and export packaging service with logging
markhazleton pushed a commit that referenced this pull request Mar 31, 2026
Merging Dependabot security fix for js-yaml prototype pollution vulnerability
markhazleton added a commit that referenced this pull request Mar 31, 2026
* feat: Add feature specification for DataSpark platform consolidation
