
Wrong insertion into followers feed? #16

Closed
andrenro opened this issue Dec 18, 2013 · 8 comments

Comments

@andrenro

Hi!
I managed to get get_user_feed() working properly, but when I try to fetch followers' feeds something strange happens. I think it might be an issue with the fanout/insertion into feedly. I've used the pin_feedly example as a starting point.

Here is some code:
models.py:

Test implementation of feedly:

```python
class WishFeed(RedisFeed):
    key_format = 'feed:normal:%(user_id)s'


class UserWishFeed(WishFeed):
    key_format = 'feed:user:%(user_id)s'


class UserWishFeedly(Feedly):
    feed_classes = dict(
        normal=WishFeed,
    )
    user_feed_class = UserWishFeed

    def add_wish(self, user, activity):
        # add_user_activity adds the activity to the user feed and starts the fanout
        self.add_user_activity(user.id, activity)

    def get_user_follower_ids(self, user_id):
        from email_auth.models import GiftItEndUser
        friends = Friend.objects.filter(Q(from_user=user_id) | Q(to_user=user_id))
        f_ids = []
        if friends.exists():
            for f in friends:
                if f.to_user.pk != user_id:
                    f_ids.append(f.to_user.pk)
                if f.from_user.pk != user_id:
                    f_ids.append(f.from_user.pk)
        ids = GiftItEndUser.objects.filter(id__in=f_ids).values_list('id', flat=True)
        return {FanoutPriority.HIGH: ids}
```
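The symmetric friend-id extraction above can be checked in isolation, with plain tuples standing in for the Friend rows (a dependency-free sketch; the pair data is made up for illustration):

```python
# Stand-in for Friend rows as (from_user, to_user) pairs; values are made up.
friend_pairs = [(7, 3), (4, 7), (7, 9)]

def follower_ids(user_id, pairs):
    """For each friendship, keep whichever end is not the user themselves."""
    ids = []
    for from_id, to_id in pairs:
        if to_id != user_id:
            ids.append(to_id)
        if from_id != user_id:
            ids.append(from_id)
    return ids

print(follower_ids(7, friend_pairs))  # → [3, 4, 9]
```

If this prints an empty list for a user who should have friends, the fanout has nobody to deliver to, regardless of the celery setup.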

views.py:

```python
@csrf_exempt
@api_view(['GET'])
@login_required
def friends_wish_feed(request, *args, **kwargs):
    feed = feedly.get_feeds(request.user.id)['normal']
    act_list = []
    for acts in feed[:25]:
        act_list.append(acts)
    json_data = json.dumps(act_list)
    return Response(json_data)
```

This is how I insert into the feeds, in views.py:

```python
...
activity = Activity(wishlist.user, WishVerb, in_prod.id)
feed = UserWishFeed(wishlist.user.id)
feed.add(activity)
feedly.add_wish(wishlist.user, activity)
...
```

From redis-cli after insertion I get this:

```
127.0.0.1:6379> keys *
1) "global:3"
2) "global:4"
3) "feed:user:7"
4) "global:7"
5) "global:9"
```

If I query 3), "feed:user:7":

```
127.0.0.1:6379> zrange "feed:user:7" 0 1
1) "13873587370000000000003005"
2) "13873587410000000000001005"

127.0.0.1:6379> hgetall "global:3"
1) "13873587470000000000006005"
2) "7,5,6,0,1387358747,"
```

It seems like the insertion went fine, except that it did not use the correct key_format. How can I fix this?

Why does it say "global:" instead of "feed:normal:" like I defined in WishFeed(RedisFeed)?

The function friends_wish_feed() returns [] every time.

Help appreciated!! :)

@tschellenbach
Owner

Hi!

The global key is used for storing the activity (mapping the id to the activity).
(We should probably find a way to make that smaller.)
(@tbarbugli, ideas on this?)

So it's normal for data to end up in global. From your setup I would expect:

- 1 key in feed:user
- 1 key in global
- one feed:normal key per follower
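That key distribution (one feed:user copy, one global record, one feed:normal entry per follower) can be sketched without Redis or Feedly, using dicts as stand-ins; all names in this sketch are illustrative, not Feedly's actual API:

```python
from collections import defaultdict

feeds = defaultdict(list)   # stands in for the Redis sorted sets
activities = {}             # stands in for the "global:<n>" hashes

def add_user_activity(user_id, follower_ids, activity_id, payload):
    activities[activity_id] = payload                    # global storage
    feeds['feed:user:%s' % user_id].append(activity_id)  # author's own feed
    for follower_id in follower_ids:                     # the fanout step
        feeds['feed:normal:%s' % follower_id].append(activity_id)

add_user_activity(7, [3, 4, 9], 'a1', 'User7 added a wish')
print(sorted(feeds))  # → ['feed:normal:3', 'feed:normal:4', 'feed:normal:9', 'feed:user:7']
```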

The feedly setup looks OK. I see two possible reasons why there is nothing in feed:normal:

A.) the list of follower ids returned is empty
B.) the celery tasks generated by the fanout don't get processed

(You could try enabling
CELERY_ALWAYS_EAGER = True
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True)
Don't use these settings in production though!

What sort of application are you building?

Best of luck!

@andrenro
Author

Hi! :)

I'm building a social gift-planner/wishlist app. I'm trying to integrate feedly so that I can build REST-based feeds of the form "User1 added product1 to wishlist1", so that other users (User1's friends) can receive this kind of feed.
I would guess possibility B), but I've tried with those settings and nothing happened. Any other possible solutions?

Current settings.py:

```python
FEEDLY_NYDUS_CONFIG = {
    'CONNECTIONS': {
        'redis': {
            'engine': 'nydus.db.backends.redis.Redis',
            'router': 'nydus.db.routers.redis.PrefixPartitionRouter',
            'hosts': {
                0: {'prefix': 'default', 'db': 0, 'host': '127.0.0.1', 'port': 6379},
            }
        },
    }
}

BROKER_URL = 'redis://127.0.0.1:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ALWAYS_EAGER = True
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
```

I will check get_user_follower_ids() for emptiness (I am pretty certain the friend relationships exist in the database, though). I would need something that works in production as well. :)

@tschellenbach
Owner

Sounds cool!

Celery is great for running async tasks, but it can be a bit of a learning curve. The docs are here:
http://www.celeryproject.org/

Try to verify that the code reaches UserWishFeedly.fanout. Could you dump the local variables you get when going into UserWishFeedly.fanout?
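One generic way to do that check, without assuming anything about Feedly's internals, is to print locals() at the top of the method; shown here on a stand-in function rather than the real fanout:

```python
def fanout(feed_class, user_ids):
    # Dump the local variables on entry, as suggested above.
    print('fanout locals:', locals())
    return ['%s:%s' % (feed_class, uid) for uid in user_ids]

keys = fanout('feed:normal', [3, 4, 9])
print(keys)  # → ['feed:normal:3', 'feed:normal:4', 'feed:normal:9']
```

If the print never fires, the task is not being executed at all (reason B above); if user_ids is empty, it is reason A.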

@andrenro
Author

I think there might be a problem with my redis setup as well. I've installed RabbitMQ and Redis, but it seems that the settings for redis/celery do nothing. If I stop RabbitMQ (rabbitmqctl stop) I get an "error [111] connection refused" when I try to add something to the feeds. My impression was that Redis was now my backend, and that I do not need to have the RabbitMQ server running? I'm confused. Celery uses EITHER RabbitMQ or Redis as its task broker, right?


@tbarbugli
Collaborator

When CELERY_ALWAYS_EAGER is set to True, tasks are executed in the same thread and celery does not connect to the task broker.
I think your celery settings are ignored/unused; you can find docs about this here:
https://docs.celeryproject.org/en/latest/django/first-steps-with-django.html
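The difference can be illustrated without celery at all: in eager mode the task runs inline in the calling thread, otherwise it just sits in the queue until a worker consumes it. A minimal stand-in, not celery's API:

```python
import queue

broker = queue.Queue()   # stands in for the task broker
results = []

def delay(task, *args, eager=True):
    if eager:
        results.append(task(*args))   # executed inline; no broker involved
    else:
        broker.put((task, args))      # sits in the queue until a worker runs it

delay(lambda x, y: x + y, 2, 3, eager=True)
print(results, broker.qsize())  # → [5] 0
```

With eager=False the result list stays empty and the queue grows, which is exactly the "nothing appears in feed:normal" symptom when no worker is running.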


@andrenro
Author

So I got it working correctly when using CELERY_ALWAYS_EAGER = True! :)
The only thing left is to make the broker settings work correctly. I don't get any errors when CELERY_ALWAYS_EAGER = False; it just doesn't fetch the newest feeds properly.


@tbarbugli
Collaborator

That's because the tasks are waiting in the broker to be consumed by a celery worker.
You need to start a celery worker for that ;)
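In the celery 3.x era this thread dates from, a worker is started from the project directory roughly like this; `yourproject` is a placeholder, not a name from this thread:

```shell
# Start a worker that consumes the queued fanout tasks from the broker.
# "yourproject" is illustrative - substitute your own Celery app / Django project.
celery -A yourproject worker --loglevel=info
```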

@thedrow

thedrow commented Dec 22, 2013

So this should be closed?
