Bump js-yaml from 4.1.0 to 4.1.1 in /sql2csv.web #2
Merged

markhazleton merged 1 commit into main · Jan 12, 2026
Conversation
Owner commented: @dependabot rebase

Force-pushed from 8d9cde1 to 60ea0b7
Owner commented: @dependabot rebase

Force-pushed from 60ea0b7 to 5afeb55
Owner commented: @dependabot rebase

Bumps [js-yaml](https://github.com/nodeca/js-yaml) from 4.1.0 to 4.1.1.
- [Changelog](https://github.com/nodeca/js-yaml/blob/master/CHANGELOG.md)
- [Commits](nodeca/js-yaml@4.1.0...4.1.1)

---
updated-dependencies:
- dependency-name: js-yaml
  dependency-version: 4.1.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Force-pushed from 5afeb55 to 89435f1
CI coverage comment (unexpanded shell template):
`$(cat coverage-summary.txt 2>/dev/null || echo "Coverage unavailable")`
markhazleton added a commit that referenced this pull request on Mar 31, 2026
…racts

Addresses all 16 findings from /speckit.analyze:

CRITICAL:
- C1: Add T113 (integration tests FR-043) + T114 (CI coverage gate) to tasks.md
- C2: Add T115 (constitution.md rename post Phase 1) to tasks.md

HIGH:
- H1: FR-014 updated to 5 formats (add JPEG); T048 updated to match
- H2: plan.md corrective action #2 corrected from UnifiedAnalysisService -> DatabaseAnalysisService line ~729 (both Gate Result and Principle VI note)
- H3: T041/T099 scope split clarified; T099 now depends on T041

MEDIUM:
- M1: FR-003 'first rows' -> 'first 10 rows' (matches edge case spec)
- M2: Add T117 (GitHub archival for DataAnalysisDemo) to tasks.md
- M3: Add T118 (pivot 50K perf test SC-005) to tasks.md
- M4: Add T119 (CLI batch test SC-006) to tasks.md
- M5: plan.md Principle V note corrected: only Sql2Csv.Tests missing TreatWarningsAsErrors
- M6: FR-007 now defines data quality score; T037 updated to verify it
- M7: Add T116 (GitHub repo rename) to tasks.md

LOW:
- L1: T028 tightened to 'controller actions'; T103 tightened to 'view forms'
- L2: T007/T009 ordering dependency noted; T009 excludes RootNamespace/AssemblyName
- L3: T017 now covers both .github/copilot-instructions.md files
- L4: T065 specifies ZIP output; contracts/web-api.md adds GET /api/Database/export-all

Total tasks: 112 -> 119
markhazleton added a commit that referenced this pull request on Mar 31, 2026
* feat: Add feature specification for DataSpark platform consolidation
  - Introduced a comprehensive feature specification document outlining user stories, acceptance criteria, functional requirements, and success criteria for the consolidation of sql2csv, DataAnalysisDemo, and related repositories into a unified DataSpark platform.
  - Document includes detailed user scenarios for data upload, exploratory analysis, chart creation, pivot tables, SQLite database tools, AI insights, statistical analysis, CLI automation, and more.
  - Established clear requirements for data ingestion, analysis, visualization, and API integration, ensuring a robust foundation for the DataSpark platform.
* docs: Create consolidation recommendation document for DataSpark
  - Added a detailed analysis and recommendation document for the consolidation of sql2csv and DataAnalysisDemo repositories.
  - Provided an executive summary, repository comparison, feature overlap matrix, and a phased approach for rebranding and feature porting.
  - Outlined unique features from DataAnalysisDemo worth porting and recommended a unified architecture for the DataSpark solution.
* docs(spec): add DataSpark consolidation plan, tasks, and design artifacts
  - plan.md: technical context, constitution check (PASS), project structure
  - research.md: 8 topics resolved (namespace rename, SQL fix, API auth, samples)
  - data-model.md: 10 entities with fields, types, constraints, validation rules
  - contracts/web-api.md: 10 REST API endpoints with request/response schemas
  - contracts/cli.md: 4 CLI commands (discover, export, schema, generate)
  - quickstart.md: developer getting-started guide
  - tasks.md: 112 tasks across 14 phases, organized by 11 user stories
  - Updated copilot agent context with tech stack from plan
  Feature: 001-dataspark-consolidation
* docs(spec): apply analysis remediation across spec, plan, tasks, contracts. Addresses all 16 findings from /speckit.analyze (critical C1-C2, high H1-H3, medium M1-M7, low L1-L4, itemized in the commit above); contracts/web-api.md adds GET /api/Database/export-all. Total tasks: 112 -> 119
* Add unit tests for SchemaService and implement API key authentication middleware
  - Created SchemaServiceTests to validate the functionality of SchemaService methods, including GetTablesAsync, GetTableNamesAsync, and GenerateSchemaReportAsync.
  - Added tests for handling null and invalid connection strings, cancellation tokens, and ensuring correct row counts and schema report formats.
  - Implemented ApiKeyAuthMiddleware to enforce API key authentication on API routes, including error handling for missing or invalid keys.
  - Added a solution file to organize the project structure and included a strong name key file for assembly signing.
* feat(dataspark): implement database tools, AI hardening, analytics, and CLI consolidation
* docs(spec): add pull request review for DataSpark consolidation with compliance assessment and recommendations
* fix(pr-9): address review findings for security, architecture, async, and logging
* fix(pr-9): close remaining security and architecture review findings
* docs(pr-review): update PR #9 follow-up review report
* fix(pr-9): address remaining findings and enhance security measures in implementation plan
* fix(pr-9): update review metadata and address async I/O discipline violations in DataSpark.Core services
* fix(core,web,tests): resolve PR #9 review findings
* fix(pr-9): update review metadata and improve assessment details in PR documentation
* feat(core): implement database discovery summary service and export packaging service with logging
markhazleton pushed a commit that referenced this pull request on Mar 31, 2026
Merging Dependabot security fix for js-yaml prototype pollution vulnerability
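The vulnerability class this merge closes can be sketched in plain JavaScript. This is an illustrative helper under stated assumptions, not js-yaml's actual code: a merge that copies keys blindly lets a crafted `__proto__` key write to `Object.prototype`, while a hardened merge skips such keys.

```javascript
// Minimal sketch of the prototype-pollution bug class fixed in js-yaml 4.1.1.
// Illustrative only; the `safeMerge` name and logic are not from js-yaml.
function safeMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (key === '__proto__' || key === 'constructor' || key === 'prototype') {
      continue; // never copy prototype-polluting keys
    }
    target[key] = source[key];
  }
  return target;
}

// JSON.parse creates "__proto__" as an ordinary own property, so it appears
// in Object.keys and would be copied by a naive merge.
const merged = safeMerge({}, JSON.parse('{"__proto__":{"polluted":true},"a":1}'));
console.log(merged.a);    // 1
console.log({}.polluted); // undefined, Object.prototype untouched
```

A merge that assigned `target[key] = source[key]` for every key would instead attach `polluted` to every object in the process, which is why a parser that honors YAML merge keys (`<<`) must filter these names.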
Bumps js-yaml from 4.1.0 to 4.1.1.

Changelog: sourced from js-yaml's changelog.

Commits:
- cc482e7 4.1.1 released
- 50968b8 dist rebuild
- d092d86 lint fix
- 383665f fix prototype pollution in merge (<<)
- 0d3ca7a README.md: HTTP => HTTPS (#678)
- 49baadd doc: 'empty' style option for !!null
- ba3460e Fix demo link (#618)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
- @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

You can disable automated security fix PRs for this repo from the Security Alerts page.