String Analyzer API

A simple RESTful API that analyzes strings, computes their properties (length, palindrome check, unique characters, word count, character frequency, SHA-256 hash), stores the results in a local JSON database, and provides filtering and natural-language-style queries.
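A minimal sketch of how such an analysis could be computed in Node.js (the function and field names below are illustrative assumptions, not necessarily the repo's actual implementation):

// analyze.js - illustrative sketch; field names are assumptions
const crypto = require('crypto');

function analyzeString(value) {
  // Palindrome check here ignores case; the real implementation may differ
  const normalized = value.toLowerCase();
  const reversed = [...normalized].reverse().join('');

  const frequency = {};
  for (const ch of value) {
    frequency[ch] = (frequency[ch] || 0) + 1;
  }

  return {
    value,
    length: value.length,
    is_palindrome: normalized === reversed,
    unique_characters: new Set(value).size,
    word_count: value.trim().split(/\s+/).filter(Boolean).length,
    character_frequency: frequency,
    sha256: crypto.createHash('sha256').update(value).digest('hex'),
  };
}

module.exports = { analyzeString };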

Tech

  • Node.js (>=14)
  • Express

Setup (local)

  1. Clone:

git clone https://github.com/YOUR-USERNAME/string-analyzer-api.git
cd string-analyzer-api

  2. Install:

npm install

  3. Start:

npm start

The server runs at http://localhost:3000 by default.

Endpoints

POST /strings

Analyze and store a string. Request body (JSON):

{ "value": "your string here" }

Responses:

  • 201 Created → returns the stored object (see the example response after this list)

  • 409 Conflict → string already exists

  • 400 Bad Request → missing "value"

  • 422 Unprocessable Entity → non-string value
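Example 201 response for "racecar" (the field names are a sketch consistent with the properties listed above, and the hash is elided; the actual response shape may differ):

{
  "value": "racecar",
  "length": 7,
  "is_palindrome": true,
  "unique_characters": 4,
  "word_count": 1,
  "character_frequency": { "r": 2, "a": 2, "c": 2, "e": 1 },
  "sha256": "..."
}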

GET /strings/:value

Retrieve the analyzed string by exact value (URL-encoded).
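For example, assuming the string was stored earlier, a value containing a space is fetched with its URL-encoded form:

curl "http://localhost:3000/strings/hello%20world"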

GET /strings

Get all strings (supports query filters):

  • is_palindrome (true/false)

  • min_length (int)

  • max_length (int)

  • word_count (int)

  • contains_character (single char)

Example: /strings?is_palindrome=true&min_length=3
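A rough sketch of how these query filters could be applied server-side (the parameter handling and stored-object fields are assumptions based on the filter list above, not the actual implementation):

// Illustrative filter logic over the stored objects
function applyFilters(strings, query) {
  return strings.filter((s) => {
    if (query.is_palindrome !== undefined &&
        s.is_palindrome !== (query.is_palindrome === 'true')) return false;
    if (query.min_length !== undefined && s.length < Number(query.min_length)) return false;
    if (query.max_length !== undefined && s.length > Number(query.max_length)) return false;
    if (query.word_count !== undefined && s.word_count !== Number(query.word_count)) return false;
    if (query.contains_character !== undefined &&
        !s.value.includes(query.contains_character)) return false;
    return true;
  });
}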

GET /strings/filter-by-natural-language?query=...

A simple, rule-based natural-language filter parser. Example: /strings/filter-by-natural-language?query=all%20single%20word%20palindromic%20strings
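A sketch of what a lightweight, rule-based parser for this endpoint could look like (the keywords and the returned filter shape are assumptions; the repo's nlpParser may use different rules):

// Illustrative rule-based parsing of the query string into filters
function parseQuery(query) {
  const q = query.toLowerCase();
  const filters = {};
  if (q.includes('palindromic') || q.includes('palindrome')) filters.is_palindrome = true;
  if (q.includes('single word')) filters.word_count = 1;
  // Hypothetical extra rule, e.g. "strings longer than 5"
  const longerThan = q.match(/longer than (\d+)/);
  if (longerThan) filters.min_length = Number(longerThan[1]) + 1;
  return filters;
}

// "all single word palindromic strings" -> { is_palindrome: true, word_count: 1 }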

DELETE /strings/:value

Delete a stored string (204 No Content on success).

Data Storage

Data is stored in a local JSON file: data/strings.json. For production, swap it for a real database.
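A minimal sketch of how the JSON file storage could be wrapped (the helper names are assumptions; only the data/strings.json path comes from the README):

// db.js - simple JSON-file persistence sketch (not concurrency-safe)
const fs = require('fs');
const path = require('path');

const DB_PATH = path.join(__dirname, 'data', 'strings.json');

function readAll() {
  if (!fs.existsSync(DB_PATH)) return [];
  return JSON.parse(fs.readFileSync(DB_PATH, 'utf8'));
}

function writeAll(strings) {
  fs.mkdirSync(path.dirname(DB_PATH), { recursive: true });
  fs.writeFileSync(DB_PATH, JSON.stringify(strings, null, 2));
}

module.exports = { readAll, writeAll };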

Deploy

Railway and Heroku are supported. No environment variables are required for local testing. When deploying, point the platform at your repo and set PORT if needed.
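When a platform injects its own port, the usual Express pattern is to fall back on the PORT environment variable (a sketch; the server entry point in this repo may differ):

const express = require('express');
const app = express();

// Respect the platform-provided PORT, defaulting to 3000 locally
const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`String Analyzer API listening on port ${port}`));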

Examples (curl)

Create:

curl -X POST http://localhost:3000/strings -H "Content-Type: application/json" -d '{"value":"racecar"}'

Get all:

curl "http://localhost:3000/strings?is_palindrome=true"

Natural language:

curl "http://localhost:3000/strings/filter-by-natural-language?query=all%20single%20word%20palindromic%20strings"


How this satisfies the task

  • Computes and stores all requested properties.
  • POST /strings includes 201, 409, 400, 422 responses.
  • GET /strings/:value, GET /strings with query filters, DELETE /strings/:value implemented.
  • GET /strings/filter-by-natural-language implemented with a minimal parser.
  • Uses semantic response codes and JSON.
  • Data stored in data/strings.json for simplicity; easy to swap for a DB.

Final notes & tips

  • The nlpParser is intentionally lightweight: it is rule-based and handles the example queries. If you need more complex parsing, compromise or another NLP library could be integrated, but that adds dependencies.
  • For production, replace the JSON file storage with a database (e.g., Postgres or MongoDB). On multi-instance deployments, a database is required for concurrency safety.
  • For hosting on Railway, create a new project and link the GitHub repo; Railway will auto-deploy. No environment variables are needed unless you change the port or storage.
