
Mongo DB #129

Closed
AmandaDLyon opened this issue Jul 3, 2020 · 19 comments
Labels
discussion Discuss if certain features are needed

Comments

@AmandaDLyon

Hey guys, I know that the intended way to get the platform started is by running `docker-compose up`; however, what I have been trying to do is start up the individual code bases.

Now, after a lot of attempts, I realized that when you start the whole stack through Docker, it creates a local Mongo instance on port 27000 which is otherwise not accessible.

So if I want to use Mongo Atlas, or run the platform without Docker (by individually running `npm serve` or `npm run devstart`), how would I do that?
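For reference, the mapping that exposes the containerized Mongo on port 27000 would typically look like this in the `docker-compose.yml` (a sketch only; the service name and image tag here are assumptions, not copied from the repo):

```yaml
services:
  mongo:
    image: mongo:4.2
    ports:
      # host port 27000 -> mongod's default in-container port 27017
      - "27000:27017"
```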

Cheers!

@smaharj1
Collaborator

smaharj1 commented Jul 3, 2020

You can actually access your Docker mongo instance by locally accessing port 27000. However, if you want to use your own Mongo Atlas, you just need to change the Mongo URL in the environment file. Please refer to this document for more info on how to do so.
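As a sketch, switching to Atlas would mean pointing the server's Mongo connection string in `.env` at the Atlas cluster instead of the local Docker instance. The variable name and placeholders below are assumptions; check the project's env template for the exact keys:

```ini
# Local Docker instance (what docker-compose wires up)
# MONGO_CONNECTION_URL=mongodb://localhost:27000/veniqa

# Mongo Atlas (replace user, password, and cluster host with your own)
MONGO_CONNECTION_URL=mongodb+srv://<user>:<password>@<cluster-host>/veniqa?retryWrites=true&w=majority
```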

@AmandaDLyon
Author

@smaharj1 That I am aware of. Now here's what's happening: I see that the collections have been created on my Atlas, but those collections are empty. Only after I have initialized the Veniqa Docker container (`docker-compose up`) am I able to access Mongo on port 27000 and see the collections with documents in them.

If I do not start the platform using docker, I get a Winston JS error stating that I need to specify a DB to connect to. It's quite weird really...

All in all, I am currently running Veniqa on my local system with the Mongo URL set to my Atlas URL. However, the collections in Atlas have no documents in them, while the Mongo instance running on port 27000 (on my local machine) has the necessary documents (the ones found in the data dump folder of this repo). Please refer to the screenshot.

This is what I see on my Mongo Atlas

Screenshot 2020-07-03 at 8 42 25 PM

@smaharj1 added the discussion label (Discuss if certain features are needed) on Jul 3, 2020
@smaharj1
Collaborator

smaharj1 commented Jul 3, 2020

The Winston error is the desired behavior. If you want to connect directly to Atlas, please change/add the Atlas URL in the .env file in your server project. That way you don't have to run Docker at all. Docker is not essential for running the application; we created it later to consolidate all the apps into one run. You can run everything separately by just switching to the .env files in your individual projects.

@AmandaDLyon
Author

So I have populated the .env file in management-webclient and management-server with the necessary credentials, and I have renamed the file to .env.development.

After these steps, I am still receiving the Winston error.

What did you mean by "switching to .env files in your individual projects"? Have I missed something?

@smaharj1
Collaborator

smaharj1 commented Jul 3, 2020

Can you rename it to .env (without .development) and try? And no, you didn't miss anything.

@AmandaDLyon
Author

I have now changed the file name to just .env, and I now see the following in my terminal:

Screenshot 2020-07-04 at 2 06 34 PM

However, localhost:4202 cannot be connected to. It seems like the API is offline (it's a guess, since if I go to localhost:4202 it says that my browser cannot find the page). Below are the console logs:

image

@AmandaDLyon
Author

AmandaDLyon commented Jul 4, 2020

I am as of now running Veniqa on my local system with the Mongo URL set to my Atlas URL. However, the collections in Atlas have no documents in them but the Mongo instance running on port 27000 (on my local machine) has the necessary documents (the ones found in the data dump folder of this repo). Please refer to the screenshot.

This is what I see on my Mongo Atlas

Screenshot 2020-07-03 at 8 42 25 PM

This issue of mine still stands. I am weirded out by how none of these problems arise when the whole platform is set up using Docker.

@smaharj1
Collaborator

smaharj1 commented Jul 4, 2020

I see that in /bin/www, the port number is 3000. Please call localhost:3000 to get successful calls. The port is only 4202 if you are going through Docker. If you want to change the port locally, you can either pass a PORT environment variable in the .env file or change the default port in /bin/www.
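The port fallback described above is the standard Express-generator pattern; a minimal sketch of what `/bin/www` typically does (the exact code in this repo may differ):

```javascript
// Typical Express-generator /bin/www pattern: prefer the PORT
// environment variable, fall back to a hard-coded default (3000).
function resolvePort(envPort, fallback) {
  const parsed = parseInt(envPort, 10);
  if (Number.isInteger(parsed) && parsed >= 0) {
    return parsed;
  }
  return fallback;
}

// No PORT in the environment -> default 3000
const portWithoutEnv = resolvePort(undefined, 3000);

// PORT=4202 in .env -> matches the port Docker exposes
const portWithEnv = resolvePort('4202', 3000);
```

So setting `PORT=4202` in the server's `.env` would reproduce the Docker port locally, while leaving it unset means the server listens on 3000.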

@AmandaDLyon
Author

I changed the AWS info in the .env file to my account's credentials and also updated config/default.json to match the location.

Along with that, I went into my AWS S3 account and made the bucket public to see if that was causing the issue. Doing so also made no difference. What should I be doing here?

image

@AmandaDLyon
Author

I see that in /bin/www, the port number is 3000. Please call localhost:3000 to get successful calls. The port is only 4202 if you are doing through Docker. If you want to change the port locally, you can either pass an environment variable PORT in .env file or change the default port in /bin/www.

I made this change and tried accessing localhost:3000 through my browser and it is stuck in an endless "loading" state.

@AmandaDLyon
Author

I have been trying like crazy and I keep getting Error Code 403 (Forbidden) when I attempt to upload images to my S3 bucket. It seems to be caused by some CORS error.

@Viveckh maybe you could help me out here since it is a backend task.

This is one of the errors :

Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at... (url to the bucket)

and then I see a bunch of "Reason: CORS request did not succeed" and "Reason: CORS header 'Access-Control-Allow-Origin' missing".

I even went ahead and wrote up a CORS config:

```xml
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>HEAD</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
```

What have you guys done to get this to work?

@smaharj1
Collaborator

smaharj1 commented Jul 7, 2020

I have been trying like crazy and I keep getting Error Code 403 Forbidden when I attempt to upload images to my S3. [...] What have you guys done to get this to work?

I think the problem here is the bucket policy. As you can see from your config, only the GET and HEAD methods are allowed, so uploads (PUT/POST) are blocked. Maybe this can help you set the bucket policy.

@AmandaDLyon
Author

@smaharj1 I finally got S3 to work. Below is the Bucket Policy I set:

```json
{
    "Version": "2012-10-17",
    "Id": "Policy1594102782490",
    "Statement": [
        {
            "Sid": "Stmt15941012824",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::name-of-bucket"
        }
    ]
}
```

and this is the CORS config that I used:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>http://localhost:5202</AllowedOrigin>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <ExposeHeader>x-amz-server-side-encryption</ExposeHeader>
        <ExposeHeader>x-amz-request-id</ExposeHeader>
        <ExposeHeader>x-amz-id-2</ExposeHeader>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
```
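One caution on the bucket policy shown above: `"Principal": "*"` combined with `"Action": "s3:*"` grants every anonymous user full control of the bucket. A more restrictive sketch (the action and resource path below are illustrative assumptions to adapt, not the project's documented policy) would scope public access to reads only:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadOnly",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::name-of-bucket/*"
        }
    ]
}
```

Note that `s3:GetObject` applies to objects, so the `Resource` ARN needs the trailing `/*`.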

@AmandaDLyon
Author

Now that I am able to upload images to my S3 bucket, upon saving the new product the management-server logs in my terminal show me the following error:

Screenshot 2020-07-07 at 2 53 37 PM

Not sure what's causing this.

Also, in my S3 bucket, I am seeing all the images that I had uploaded in the "thumbnail" folder. Is that supposed to happen? I understand that all of them need to be present in the "detailed images" folder, but shouldn't only the image that is set as the thumbnail be part of the "thumbnails" folder?

@smaharj1
Collaborator

smaharj1 commented Jul 7, 2020

@AmandaDLyon, what do you get in the API response for this? Specifically, I'm trying to look at errorDetails in the response.

@AmandaDLyon
Author

AmandaDLyon commented Jul 7, 2020

@smaharj1 It's a 500 (Internal Server Error).

@AmandaDLyon
Author

It's a 500 (Internal Server Error).

Screenshot 2020-07-07 at 8 26 12 PM

@AmandaDLyon
Author

@smaharj1 I re-installed the whole platform after running `docker system prune -a` and then running `docker-compose up`, thinking this might fix the issue; however, it did not help.

Could you attempt to add or edit an existing product to check whether the issue persists on your end as well? Ideally it shouldn't. I am wondering if this is a bad config on my local machine. Maybe a permissions issue?

@AmandaDLyon
Author

@smaharj1 I feel like this issue needs to be separated from this discussion thread, as the topics aren't the same. Please refer to #132 and provide a solution there.

Thanks buddy!

@smaharj1 smaharj1 closed this as completed Jul 9, 2020