
sql-crawler

Connect to multiple SQL database servers and run queries to collect data.

When Do You Need sql-crawler?

  1. Your application uses a single-tenant database.
  2. You need to poll anonymized information from multiple single-tenant production SQL servers.
  3. Credentials for the production SQL servers are managed by a different department.
  4. Your polling SQL statements need to go through an approval process.

Demo

https://agilesalt.net/sql-crawler

Configuration

Configure your own SQL server list and SQL queries git repository, then specify them in appsettings.json. These settings can be overridden with environment variables.
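As a hypothetical sketch only (the actual key names may differ; check appsettings.json in the repository), the configuration might look like:

```json
{
  "SqlServerListUrl": "https://example.com/my-server-list.csv",
  "GitSqlSource": "https://github.com/kennethchoe/sql-crawler-sqls.git"
}
```

Since this is an ASP.NET Core app, each key can be overridden by an environment variable of the same name (ASP.NET Core uses `__` as the separator for nested keys).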

Sql Server List

A comma-delimited file. Columns marked with * are required.

  1. ServerId*: unique identifier of the server.
  2. ServerName: user-friendly name of the server.
  3. Scope: see Scope below.
  4. Description.
  5. UserData1, UserData2: custom info you can use in your SQL as a parameter (@UserData1) or a Handlebars replacement ({{UserData1}}). For other properties available in your SQL, see SqlServerInfoPublic.
  6. ServerDriver: mssql (default) or sqlite. To support more, add a class under Drivers.
  7. DataSource*, UseIntegratedSecurity, SqlUsername, SqlPassword: connection info consumed by the server drivers. See each driver's implementation for details.

View Sample
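A hypothetical server list might look like the following (server names, connection details, and the exact column order here are made up for illustration; the linked sample shows the real layout):

```csv
ServerId,ServerName,Scope,Description,UserData1,UserData2,ServerDriver,DataSource,UseIntegratedSecurity,SqlUsername,SqlPassword
prod-east,Production East,a/b,East coast tenant,acme,us-east,mssql,sqlserver01.example.com,false,crawler,secret
local-test,Local Test,,Local sqlite file,,,sqlite,./test.db,,,
```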

Sql Queries Git Repo Url

Files ending with .sql are recognized as queries. If an approval process is needed, you can configure permissions on your git repo.

View Sample

Scope

SQL queries can be placed in subfolders of the git repository. The path of a query file then becomes the scope of the query.

The server list may have an optional Scope column. ServerId must still be unique across the entire list.

When you run a query, it runs against servers whose Scope matches the query's scope or sits below it. For example, a/b/test.sql has scope a/b; it runs on servers whose Scope is a/b or a/b/c, but not on a server with Scope a or an empty Scope.

For more details, check out the demo site.

The same uniqueness rule applies to SQL query file names: if you have the same file name in different paths, sql-crawler will raise an error.
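The matching rule above can be sketched in Python (this is only an illustration of the rule, not the project's actual implementation, which is in C#):

```python
def scope_matches(query_scope: str, server_scope: str) -> bool:
    """A server matches a query when its Scope equals the query's scope,
    or lies below it in the path hierarchy (e.g. a/b/c is below a/b)."""
    return server_scope == query_scope or server_scope.startswith(query_scope + "/")

# a/b/test.sql (scope "a/b") runs on servers scoped a/b or a/b/c,
# but not on a server scoped "a" or one with an empty Scope.
print(scope_matches("a/b", "a/b"))    # True
print(scope_matches("a/b", "a/b/c"))  # True
print(scope_matches("a/b", "a"))      # False
print(scope_matches("a/b", ""))       # False
```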

Commands

  • n.bat serve : launch Vue in dev mode with hot reloading.
    • You must also launch the back end from Visual Studio using the IIS Express profile.
  • run-webapp.ps1 : launch the web app with the transpiled Vue code in it, on http:5002 and https:5003.
  • build.ps1 -target publish : create a web package that can run under IIS.
  • build.ps1 -target build-docker : create a web package in a Linux docker image called sql-crawler.

Playing with Docker Image

  • Running in a Linux container: docker run -p 5004:80 kennethchoe/sql-crawler

    • If you want to attach your own SQL credentials, mount the folder to app/data. Refer to docker-test-data.ps1 for an example.
  • docker-run.ps1 : build, create a Linux docker image, and run it on http:5004.

    • -port 1234 : run it on http:1234.
    • -gitSqlSource https://git-repo-url : specify the SQL source. The default value is https://github.com/kennethchoe/sql-crawler-sqls.git.
    • -gitUsername username -gitPassword password : specify the git repo's credentials, if needed.
    • -skipRebuild : skip rebuilding the docker image.

Useful Docker Commands

Because I forget them after a while...

  • docker ps : display containers currently running
  • docker container prune : clean up all stopped containers
  • docker logs -f container-identifier : show the console log until you press Ctrl+C.
  • docker inspect container-identifier : show full info about the container, including port and volume mappings.
  • docker exec -it container-identifier bash : launch an interactive bash shell in the container. Type exit to disconnect.
  • docker exec container-identifier ls /app : tap into the container, display the result of ls /app, and disconnect immediately.
