
tokenize(): produces inconsistent results #263

Closed
layershifter opened this issue May 17, 2021 · 4 comments

@layershifter

Hey folks!

I am working on a plugin for the :global() selector and am using tokenize() there. I am not sure whether it's a bug, but the results are inconsistent:

tokenize(':global(body) .foo') 
// => [ ':', 'global', '(body)', ' ', '.foo' ]
//            👆 selector produces a single string
tokenize(':global([data-popper-placement]) .foo') 
// => [ ':', 'global', '([data-popper-placement])', ' ', '.foo' ]
//                      👆 selector produces a single string
tokenize(":global([data-popper-placement^='top'])")
// => [ ':', 'global', "([data-popper-placement^='top'", ']', ')' ]
//                                                       👆 closing bracket and a paren are separate strings

I also tried this string with the namespace plugin:

serialize(compile(`:global([data-popper-placement^="top"]) .foo { color: red }`), middleware([namespace, stringify]))
// => [data-popper-placement^="top]) .foo[object Object]{color:red;}
//                                ^ brackets are closed properly ✔
//                                       ^ but "[object Object]" is definitely an issue ❌

Can you please clarify whether this is a bug in the tokenizer?

@thysultan
Owner

The last one looks like a bug; I'm not sure about the others. I'll look into it.

@layershifter
Author

@thysultan were you able to check if it is a bug? 😊

@thysultan
Owner

The last one was a bug (now fixed in the latest patch); for the rest, you can manually tokenize recursively to get the desired depth of tokens.
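
For reference, a recursive pass along those lines could look like the minimal sketch below. tokenizeDeep is a hypothetical helper, not a stylis API, and the nested output shown is illustrative:

import { tokenize } from 'stylis'

// Hypothetical helper: whenever a token is a parenthesized group such as
// '(body)', strip the wrapping parens and tokenize the contents again,
// producing nested token arrays down to the desired depth.
function tokenizeDeep(value) {
  return tokenize(value).map((token) =>
    token.charCodeAt(0) === 40 // 40 === '('.charCodeAt(0)
      ? tokenizeDeep(token.slice(1, -1))
      : token
  )
}

tokenizeDeep(':global([data-popper-placement]) .foo')
// => [':', 'global', ['[data-popper-placement]'], ' ', '.foo'] (illustrative)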

