Episode

  7,015 views

Drifting Ruby

A day ago

Comments: 10
@vadimderyabin8617 · 7 years ago
In the sidekiq.rb initializer, config.redis = {url: ENV['REDIS_PROVIDER']} would be wrong as long as ENV['REDIS_PROVIDER'] = 'REDISTOGO_URL', as we see in your example. The correct version would be config.redis = {url: ENV[ENV['REDIS_PROVIDER']]}.
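The double lookup the commenter describes can be sketched in plain Ruby. The environment variable values below are hypothetical placeholders standing in for what Heroku's Redis add-ons provide:

```ruby
# Hypothetical Heroku-style environment: REDIS_PROVIDER holds the *name*
# of the variable that actually contains the Redis URL.
ENV['REDIS_PROVIDER'] = 'REDISTOGO_URL'
ENV['REDISTOGO_URL']  = 'redis://user:secret@example-host:6379'

# A single lookup returns the provider name, not a usable URL.
wrong = ENV['REDIS_PROVIDER']

# A double lookup resolves through the name to the actual URL,
# which is what Sidekiq's config.redis expects.
right = ENV[ENV['REDIS_PROVIDER']]

puts wrong  # "REDISTOGO_URL"
puts right  # "redis://user:secret@example-host:6379"
```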
@zulhilmizainudin · 8 years ago
I've got some questions:
1. How many concurrent connections can redis-server handle at one time if I'm using my own server (not Heroku)? Currently I'm using Heroku, and all Redis add-ons on Heroku are priced by the number of concurrent connections offered.
2. Assuming the answer to #1 is unlimited, what's the best number to put in config/sidekiq.yml for production concurrency?
3. What's the best practice for architecting a Rails Sidekiq app? Should I install and run redis-server and the sidekiq process on the same server as my Rails app, or split it like this?
- Rails app on server 1
- Sidekiq process on server 2
- redis-server on server 3
4. What happens when redis-server goes down? Are the jobs saved in the redis-server database?
5. Do the redis-server and sidekiq processes go down every time we deploy a new release of our Rails app? If YES, and the redis-server database is not persisted, how do we overcome that problem?
@DriftingRuby · 8 years ago
In Redis 2.6, the default max connections is 10,000. However, this can be modified in the redis.conf with the maxclients setting.
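A minimal redis.conf fragment raising that limit might look like the following; the value shown is illustrative, and the effective ceiling is also bounded by the OS file-descriptor limit:

```conf
# redis.conf -- raise the connection cap above the 10,000 default
maxclients 20000
```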
@zulhilmizainudin · 8 years ago
Thanks. How about other questions?
@DriftingRuby · 8 years ago
Sorry, I didn't see the additional questions. I usually prefer any type of Q&A in the comment section at www.driftingruby.com/episodes/sidekiq-on-production as it provides a better text layout.
2. The default concurrency is 25; it is recommended not to exceed 50 for this value.
3. I definitely split my app and redis-server onto separate boxes. However, unless your app is growing past 1,000 rps, having the sidekiq process on the same server isn't a big deal. It will scale as you add additional servers behind a load balancer. If your app is much heavier on background jobs than web requests, you can extract the sidekiq service out and have a dedicated pool of servers to handle those background requests.
4. If the Redis server goes down, sidekiq will not be able to reach it to write jobs, and the application will likely crash. However, you can set a network_timeout (defaulting to 1 second) in your config to allow for latency.
5. Redis should not need to go down whenever there is an update to your application, but sidekiq will need to be reloaded to pick up the new codebase. Regardless, I think the real question is: what happens when Redis runs out of memory? For a sidekiq Redis server, you should have redis.conf set to persist the keys/values so that the data is periodically written to disk and can be reloaded.
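The concurrency and timeout settings mentioned above could be sketched like this. This is a non-authoritative example; the REDIS_URL variable and the timeout value are placeholders you would adjust for your environment:

```ruby
# config/sidekiq.yml
#   :concurrency: 25   # the default; keep at or below ~50

# config/initializers/sidekiq.rb -- a sketch, assuming REDIS_URL is set
Sidekiq.configure_server do |config|
  config.redis = {
    url: ENV['REDIS_URL'],
    network_timeout: 5  # seconds; raised from the 1s default to tolerate latency
  }
end
```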
@zulhilmizainudin · 8 years ago
Thank you for your answer! More questions:
1. That means, if I put the sidekiq service on different boxes, I need to make sure those boxes (sidekiq services) can talk to the Rails app database that sits alongside my Rails app, right?
2. And every time I deploy my Rails app, I need to deploy it to my sidekiq boxes too, right?
3. What do you mean by "set a network_timeout (defaulting to 1 second)"? I don't get it yet.
4. Regarding setting redis.conf to persist the keys/values, is that not turned on by default? Do you have any reference?
Thank you!
@DriftingRuby · 8 years ago
1. You would still deploy your application as you normally would, except for provisioning the web/app services. You would still need to create your database connections and secrets on the sidekiq service.
2. That is correct.
3. Since Redis is a separate entity from sidekiq and your Rails application, there can be latency between the services for whatever reason. It is typically extremely quick, but other environmental issues could arise which impose increased latency. For example, if you are communicating with a Redis service over the internet instead of a local network connection, there are several hops the request has to make, and high latency in an internet backbone could delay the communication.
4. Have a look at redis.io/topics/persistence. Typically, a Redis store lives in memory; if that server/service is rebooted, the memory is purged and any keys/values are lost. Saving the data periodically to disk allows the Redis server to reread the data instead of expiring the oldest key/value to make room for more memory.
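The persistence settings described above map to a couple of well-known redis.conf directives. A minimal sketch, with thresholds matching Redis's shipped defaults:

```conf
# redis.conf -- RDB snapshots: save to disk if N keys changed within T seconds
save 900 1
save 300 10
save 60 10000

# Append-only file for more durable, per-write persistence
appendonly yes
appendfsync everysec
```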
@gokulps5433 · 5 years ago
Is it possible to view the processed jobs in sidekiq?
@sye119 · 5 years ago
Yes, you can view it by accessing "/sidekiq", for example www.example.com/sidekiq. Here is the setup process: github.com/mperham/sidekiq/wiki/Monitoring
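Mounting that dashboard boils down to a couple of lines in the Rails routes file. A sketch, following the wiki linked above; the authentication wrapper is an assumption you would adapt to your own auth setup:

```ruby
# config/routes.rb -- expose the Sidekiq web UI at /sidekiq
require 'sidekiq/web'

Rails.application.routes.draw do
  # In production, protect the dashboard, e.g. with Devise:
  # authenticate :user, lambda { |u| u.admin? } do
  mount Sidekiq::Web => '/sidekiq'
  # end
end
```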