
declare routes with annotation #303

Closed
sebastien-ma opened this issue Jun 19, 2015 · 9 comments

Comments

@sebastien-ma

I wonder why Spark does not support annotated routes. I dug up an old issue from two years ago, which said it's "against" the paradigm Spark stands for.
I understand that. But declarative routes still have their advantages:

  • they can be documented with Javadoc
  • they are easier to navigate to in an IDE
  • they make the route map more structured

The code would look like this.

/**
 * book api
 */
@Endpoint(path = "/book")
public class BookAPI {

    /**
     * get book by id
     */
    @Endpoint(path = "/:id", methods = HttpMethod.get)
    public static Object get(Request req, Response res) {

        return "hello";
    }

    public static void main(String... args) {

        Spark.scan(Arrays.asList("spark.examples"));

    }

}

Isn't it a good thing to give users more options? They could choose the style they feel familiar with.

@jkwatson
Contributor

If people want this, they should use Jersey. Please keep annotation-based routing out of Spark, at all costs. It's the reason I use it.

@chillenious

If people want this, they should use Jersey. Please keep annotation-based routing out of Spark, at all costs. It's the reason I use it.

+1

@zeroflag
Contributor

Please no. There are plenty of annotations based web frameworks out there. Keep this one as it is.

@amarseillan-zz
Contributor

If people want this, they should use Jersey. Please keep annotation-based routing out of Spark, at all costs. It's the reason I use it.

👍

@ribomation

I agree with the comments above. Spark's sweet spot is that you know exactly (almost) what is going on. That said, I think it would be nice to have some kind of plugin layer that people could use to add extensions to Spark, together with something like a "spark-contrib" area for plugins.
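
For illustration, a rough sketch of what such an extension point might look like. The SparkPlugin interface is hypothetical, Spark provides nothing like it; the idea is only that a contrib module could register its routes and filters without touching Spark core.

// Hypothetical extension point: Spark has no such interface; this only
// sketches the "plugin layer" idea, where a spark-contrib module would
// register whatever it needs (routes, filters, ...) via the normal Spark API.
public interface SparkPlugin {

    // Called once at startup so the plugin can do its registrations.
    void apply();

    static void install(SparkPlugin... plugins) {
        for (SparkPlugin plugin : plugins) {
            plugin.apply();
        }
    }
}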

@debarshri

+1

But when you have too many routes, it kind of becomes a pain.
So a few days back I wrote this app: https://github.com/debarshri/spark-annotation
Crappy, but it works and is external to Spark.

It looks something like this:

// Request, Response and Route come from the Spark core API;
// @SparkUrl is provided by the spark-annotation library linked above.
import spark.Request;
import spark.Response;
import spark.Route;

@SparkUrl(path = "/", method = "GET")
public class SparkAnonTest implements Route {

    @Override
    public Object handle(Request request, Response response) {
        return "something";
    }
}
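
Keeping that layer external really only needs reflection plus the ordinary Spark.get/Spark.post calls. A minimal sketch of what such a registrar might look like, assuming a @SparkUrl annotation retained at runtime with path() and method() members as shown above, and an explicit list of handler classes rather than classpath scanning:

import spark.Route;
import spark.Spark;

// Illustration only: this registrar is not part of Spark or of the
// spark-annotation project. It shows how an external library can map
// @SparkUrl annotations onto plain Spark.get/Spark.post calls.
public class AnnotationRegistrar {

    public static void register(Class<?>... handlers) throws Exception {
        for (Class<?> clazz : handlers) {
            SparkUrl url = clazz.getAnnotation(SparkUrl.class);
            if (url == null) {
                continue; // class is not annotated, skip it
            }
            Route route = (Route) clazz.newInstance();
            if ("GET".equalsIgnoreCase(url.method())) {
                Spark.get(url.path(), route);
            } else if ("POST".equalsIgnoreCase(url.method())) {
                Spark.post(url.path(), route);
            } else {
                throw new IllegalArgumentException("Unsupported method: " + url.method());
            }
        }
    }
}

Wiring up the example class would then be a single explicit call such as AnnotationRegistrar.register(SparkAnonTest.class).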

@borud

borud commented Aug 18, 2015

I agree with the previous sentiments that annotations do not belong in Spark, but I figured I'd explain why I see the way Spark does routing as worthwhile.

I've been using JAX-RS containers for a while, and the trouble with annotations is that there is always some amount of guesswork involved. You throw classes with annotations out there and then you have no real control over how they get picked up and interpreted. Worse still, since there is a disconnect between where you add the annotation and where the annotated resource is added to the configuration/wiring, it gets hard to understand exactly what is happening and in what order. It is all magic.

When you explicitly configure the routes, adding resources to the configuration is explicit. You can point to a place in the code and say "here is where it gets added". You also get the opportunity to detect and report errors as you do this, since the framework can give you a compiler error or an exception that directs you towards the exact place where you had a problem.

Explicit configuration also makes it easier for new entrants to a project to follow what is happening. While it may be easy for experienced JAX-RS users to navigate smallish codebases, navigating a large, complex codebase when you are inexperienced or rusty gets hairy quickly.
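
To make that concrete, explicit registration with Spark's ordinary API keeps the whole wiring in one visible place (the /book/:id path simply mirrors the proposal at the top of this issue):

import static spark.Spark.get;

public class BookApi {

    public static void main(String[] args) {
        // The route exists exactly here: this is the single line a reader has
        // to find to know how GET /book/:id is handled, and a mistake in the
        // handler signature is a compile error rather than a scanning surprise.
        get("/book/:id", (request, response) -> "hello " + request.params(":id"));
    }
}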

I actually think Spark could be made even better by moving further in the direction it is already heading: ditching static instances, enforcing immutability, and providing creature comforts through supporting code instead, such as sensible builders and, for those who prefer the static model, a static facade (although I am conflicted about the facade, since static instances are... well... extremely problematic and you don't want to encourage that).
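
As a small sketch of the builder idea, assuming nothing beyond the existing Spark API (RouterBuilder is made up for illustration, not something Spark ships, and Spark is still static underneath it here):

import java.util.ArrayList;
import java.util.List;

import spark.Route;
import spark.Spark;

// Hypothetical sketch only: the builder collects route registrations and
// applies them in one explicit call, illustrating the "sensible builders"
// direction suggested above.
public final class RouterBuilder {

    private final List<Runnable> registrations = new ArrayList<>();

    public RouterBuilder get(String path, Route route) {
        registrations.add(() -> Spark.get(path, route));
        return this;
    }

    public RouterBuilder post(String path, Route route) {
        registrations.add(() -> Spark.post(path, route));
        return this;
    }

    // Applying the configuration is a single, explicit step.
    public void start() {
        registrations.forEach(Runnable::run);
    }

    public static void main(String[] args) {
        new RouterBuilder()
                .get("/book/:id", (req, res) -> "hello")
                .start();
    }
}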

@dalexander01

+1 to everything said above

@jkwatson
Contributor

+1 to @borud

@tipsy closed this as completed Nov 22, 2015