Finish the job of moving shapes into the stream
This commit should finish the `coloring_in_tokens` feature, which moves
the shape accumulator into the token stream. This allows rollbacks of
the token stream to also roll back any shapes that were added.

This commit also adds a much nicer syntax highlighter trace, which shows
all of the paths the highlighter took to arrive at a particular coloring
output. This change is fairly substantial, but really improves the
understandability of the flow. I intend to update the normal parser with
a similar tracing view.

In general, this change also fleshes out the concept of "atomic" token
stream operations.

A good next step would be to try to make the parser more
error-correcting, using the coloring infrastructure. A follow-up step
would involve merging the parser and highlighter shapes themselves.
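The rollback behavior described above can be sketched with simplified, hypothetical types (not nushell's actual token-stream or shape API). The key point is that once the shape accumulator lives inside the stream, a single checkpoint/rollback covers both the stream position and any speculatively added shapes:

```rust
// Hypothetical sketch: shapes stored inside the stream roll back with it.
#[derive(Debug)]
struct Shape {
    start: usize,
    end: usize,
}

struct TokenStream {
    pos: usize,
    shapes: Vec<Shape>,
}

struct Checkpoint {
    pos: usize,
    shapes_len: usize,
}

impl TokenStream {
    // Record the stream position and how many shapes exist so far.
    fn checkpoint(&self) -> Checkpoint {
        Checkpoint {
            pos: self.pos,
            shapes_len: self.shapes.len(),
        }
    }

    // Restore the position and drop any shapes added after the checkpoint.
    fn rollback(&mut self, cp: Checkpoint) {
        self.pos = cp.pos;
        self.shapes.truncate(cp.shapes_len);
    }

    fn color(&mut self, shape: Shape) {
        self.shapes.push(shape);
    }
}

fn main() {
    let mut stream = TokenStream { pos: 0, shapes: vec![] };
    let cp = stream.checkpoint();

    // Speculative parse: advance and record a shape.
    stream.pos = 3;
    stream.color(Shape { start: 0, end: 3 });

    // The speculation failed; rolling back discards the shape too.
    stream.rollback(cp);
    assert_eq!(stream.pos, 0);
    assert!(stream.shapes.is_empty());
}
```

Before this change, a separate shape accumulator could retain shapes for paths the parser had already abandoned; folding it into the stream makes rollback cover both.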
wycats committed Oct 22, 2019
1 parent 82b24d9 commit 66d0180
Showing 22 changed files with 886 additions and 356 deletions.
4 changes: 2 additions & 2 deletions Cargo.toml
@@ -84,7 +84,7 @@ heim = {version = "0.0.8", optional = true }
battery = {version = "0.7.4", optional = true }
rawkey = {version = "0.1.2", optional = true }
clipboard = {version = "0.5", optional = true }
-ptree = {version = "0.2", optional = true }
+ptree = {version = "0.2" }
image = { version = "0.22.2", default_features = false, features = ["png_codec", "jpeg"], optional = true }

[features]
@@ -95,7 +95,7 @@ binaryview = ["image", "crossterm"]
sys = ["heim", "battery"]
ps = ["heim"]
# trace = ["nom-tracable/trace"]
-all = ["raw-key", "textview", "binaryview", "sys", "ps", "clipboard", "ptree"]
+all = ["raw-key", "textview", "binaryview", "sys", "ps", "clipboard"]

[dependencies.rusqlite]
version = "0.20.0"
1 change: 0 additions & 1 deletion src/fuzzysearch.rs
@@ -74,7 +74,6 @@ pub fn interactive_fuzzy_search(lines: &Vec<&str>, max_results: usize) -> Select
selected = 0;
}
_ => {
-// println!("OTHER InputEvent: {:?}", k);
}
},
_ => {}
3 changes: 0 additions & 3 deletions src/main.rs
@@ -3,9 +3,6 @@ use log::LevelFilter;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
-#[cfg(feature1)]
-println!("feature1 is enabled");
-
let matches = App::new("nushell")
.version(clap::crate_version!())
.arg(
1 change: 0 additions & 1 deletion src/parser.rs
@@ -14,7 +14,6 @@ pub(crate) use parse::files::Files;
pub(crate) use parse::flag::{Flag, FlagKind};
pub(crate) use parse::operator::Operator;
pub(crate) use parse::parser::{nom_input, pipeline};
pub(crate) use parse::pipeline::{Pipeline, PipelineElement};
pub(crate) use parse::text::Text;
pub(crate) use parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
pub(crate) use parse::tokens::{RawNumber, RawToken};
10 changes: 9 additions & 1 deletion src/parser/hir/expand_external_tokens.rs
@@ -61,6 +61,10 @@ impl ColorSyntax for ExternalTokensShape {
type Info = ();
type Input = ();

+fn name(&self) -> &'static str {
+    "ExternalTokensShape"
+}
+
fn color_syntax<'a, 'b>(
&self,
_input: &(),
@@ -192,6 +196,10 @@ impl ColorSyntax for ExternalExpression {
type Info = ExternalExpressionResult;
type Input = ();

+fn name(&self) -> &'static str {
+    "ExternalExpression"
+}
+
fn color_syntax<'a, 'b>(
&self,
_input: &(),
@@ -212,7 +220,7 @@ impl ColorSyntax for ExternalExpression {
Ok(atom) => atom,
};

-atom.color_tokens(token_nodes.mut_shapes());
+token_nodes.mutate_shapes(|shapes| atom.color_tokens(shapes));
return ExternalExpressionResult::Processed;
}
}
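The last hunk's switch from handing out `mut_shapes()` to a closure-taking `mutate_shapes(|shapes| ...)` is the "atomic token stream operation" idea from the commit message. A minimal sketch of the pattern, using hypothetical simplified types rather than nushell's real `TokenNodes` API: because the stream regains control before and after the closure runs, it can trace or group each mutation as a single unit.

```rust
// Hypothetical sketch of a closure-based mutation API.
struct TokenStream {
    shapes: Vec<u32>,
}

impl TokenStream {
    // The stream brackets the mutation, so it can treat it atomically
    // (e.g. open a trace frame before, close it after).
    fn mutate_shapes<T>(&mut self, block: impl FnOnce(&mut Vec<u32>) -> T) -> T {
        // a before-hook (tracing, bookkeeping) could run here
        let result = block(&mut self.shapes);
        // an after-hook could run here
        result
    }
}

fn main() {
    let mut stream = TokenStream { shapes: vec![] };
    stream.mutate_shapes(|shapes| shapes.push(7));
    assert_eq!(stream.shapes, vec![7]);
}
```

With a raw `&mut Vec` accessor, mutations happen outside the stream's control; the closure form is what makes the highlighter trace described above possible without scattering instrumentation at every call site.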
