Fixes #109 by adding SCALE ops for IN and PARAM #124

Closed
wants to merge 5 commits

Conversation

@burnsauce
Contributor

@burnsauce burnsauce commented Oct 25, 2017

What does this PR do?

Adds two new operators, IN.SCALE and PARAM.SCALE.

Syntax: IN.SCALE -100 100

All subsequent calls to IN will return values scaled linearly into the range -100 to 100.

Provide links to any related discussion on lines.

https://llllllll.co/t/new-teletype-operators-and-features/9076

How should this be manually tested?

Set various PARAM.SCALE values and turn the knob around.

Any background context you want to provide?

Lays the groundwork for CAL calibration operators.

I have:

  • updated CHANGELOG.md
  • updated the documentation
Except in pattern / tracker mode, which would require a refactor.
@burnsauce
Contributor Author

@burnsauce burnsauce commented Oct 25, 2017

I did not solicit the community for a name for this operator. It's possible people would prefer MAP.

@cmcavoy
Contributor

@cmcavoy cmcavoy commented Oct 27, 2017

There's already precedent for "scale" meaning mapping a value onto a range, in the existing SCALE op. I'm in favor of renaming SCALE to MAP, but that would be a breaking change.

burnsauce added 4 commits Oct 23, 2017
@burnsauce burnsauce force-pushed the burnsauce:scale branch from a020b3b to 95e2954 Oct 28, 2017
@burnsauce burnsauce closed this Oct 28, 2017
@burnsauce burnsauce deleted the burnsauce:scale branch Oct 30, 2017
2 participants