Import to Pressensor

A couple of weeks ago, gillesbeesley opened an interesting issue on GitHub basically asking this:

As a Pressensor user with a Flair 58 and modified Breville dual boiler, I’d like to download my shot data in CSV format and upload it to the Pressensor app for reference when pressure and flow profiling on my machines.

Since I’ve never used Pressensor, it took some back and forth to figure out the exact format. Eventually I found a way to add the export functionality, so now you can easily export any shot as a CSV file by clicking the table-shaped button on any shot page.
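As a rough illustration of what such an export involves, a shot’s time series can be serialized to CSV with Ruby’s standard library. The column names and sample structure below are my own invention for the sketch, not Visualizer’s actual schema:

```ruby
require "csv"

# Hypothetical sketch: a shot is a series of samples taken during the
# pull -- elapsed seconds, pressure (bar), and flow (ml/s) -- which we
# serialize as CSV rows. Column names are made up for illustration.
def shot_to_csv(samples)
  CSV.generate do |csv|
    csv << %w[elapsed pressure flow]
    samples.each do |sample|
      csv << [sample[:elapsed], sample[:pressure], sample[:flow]]
    end
  end
end

samples = [
  { elapsed: 0.0, pressure: 1.2, flow: 0.0 },
  { elapsed: 0.5, pressure: 2.8, flow: 0.4 }
]
puts shot_to_csv(samples)
```

The real export naturally has to match whatever columns the importing app expects, which is exactly the back-and-forth described above.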

Updates RSS

Since I support RSS on people’s pages, there’s absolutely no reason there shouldn’t be a feed for Updates as well. So now there is: https://visualizer.coffee/updates/feed.

Misc

As usual, there were many other minor improvements, most notably switching the JavaScript CDN provider from Skypack to jsDelivr. I didn’t have any issues, but Skypack seems abandoned, and importmap-rails recently switched its logic to always download packages. I decided to use the new approach wherever possible, and, since I was already working with importmaps, I took the opportunity to switch to jsDelivr for packages that are not yet compatible with the new approach due to their complexity.
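For context, the two approaches side by side in config/importmap.rb might look something like this — the package names and URL are placeholders I made up, not Visualizer’s actual pins:

```ruby
# config/importmap.rb -- hypothetical pins for illustration

# Downloaded/vendored package (the new importmap-rails default),
# served from the app itself:
pin "local-charting-lib"

# Package too complex to vendor yet, still pinned to the jsDelivr CDN:
pin "complex-package", to: "https://cdn.jsdelivr.net/npm/complex-package@1.0.0/+esm"
```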

Enjoy the remainder of your Easter! 🐣


Migrating from Sidekiq to Solid Queue with Scheduling

This is going to be a more technical post than what you’re used to reading here, but it’s quite a big change, so I believe it’s the right place to share it.

Solid Queue

I, like many other Rails developers, was using Sidekiq for background jobs. It’s a great gem, but unless you’re paying for the Pro version, it’s quite unsafe: when Sidekiq starts processing a job, it removes it from Redis, so if the worker is interrupted or killed before the job finishes, that job is lost. The solution is upgrading to the Pro version, but that is quite expensive for a small project like Visualizer.

So when David introduced Solid Queue at Rails World 2023, it looked like a perfect solution. It was lacking a few crucial things, though. The first was a simple web interface to see the jobs. That was quickly resolved with Mission Control - Jobs. It’s very simple, but it gets the job done. No pun intended.

The other big thing was scheduling jobs. I was using sidekiq-scheduler for this before, and it’s a very important feature to have. So I was closely following PR #155, and yesterday Rosa merged it in. 🥳

Literally minutes later Visualizer was running it in production, and it’s been working great since.

The Migration

At first, I had to point directly to the GitHub repository, but the updates have now been released, so all you need to do is replace the sidekiq* entries in your Gemfile with:

gem "solid_queue", ">= 0.3.0"
gem "mission_control-jobs", ">= 0.2.0"

Then you need to run bundle install, bin/rails generate solid_queue:install, and bin/rails db:migrate to prepare the database and generate the configuration file.

After that you should set the queue adapter in config/environments/production.rb:

config.active_job.queue_adapter = :solid_queue

Finally, you need to expose the jobs UI in your config/routes.rb, ideally behind some kind of authentication (the snippet below uses Devise’s authenticate route helper):

authenticate :user, ->(user) { user.admin? } do
  mount MissionControl::Jobs::Engine, at: "/jobs"
end

At this point, you can find and delete the rest of the Sidekiq-related code in your project, and you should be good to go.

Configuring Recurring Jobs

Scheduling is configured in config/solid_queue.yml, and it’s quite flexible. The schedule key accepts anything Fugit::Cron can parse, class is the job you want to run, and the optional args are the arguments to pass to the job; it can even take kwargs as the last parameter. That all goes under recurring_tasks inside dispatchers. Here’s an example from Visualizer:

default: &default
  dispatchers:
    - polling_interval: 1
      recurring_tasks:
        shared_shot_cleanup_job:
          class: SharedShotCleanupJob
          schedule: "@hourly"
        duplicate_stripe_subscriptions_job:
          class: DuplicateStripeSubscriptionsJob
          schedule: "@daily"
        airtable_webhook_refresh_all_job:
          class: AirtableWebhookRefreshAllJob
          schedule: "0 0 */6 * *"

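For a task that takes arguments, the same file could (hypothetically) contain an entry like this — the job name and argument values are invented for illustration:

```yaml
example_cleanup_job:
  class: ExampleCleanupJob
  schedule: "*/30 * * * *"            # every 30 minutes
  args: [ 1000, { batch_size: 500 } ] # positional arg, then kwargs hash
```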
Error Handling

Because Solid Queue is a new gem, you can’t assume your error tracking service will support it out of the box. For example, I’m using AppSignal, and they don’t support it yet. There’s an open issue, but not much progress yet. At least not publicly.

Luckily, this gem is built by the Rails core team, so they provide a very simple on_thread_error configuration option. In config/environments/production.rb, right under the queue adapter configuration, you can add:

config.solid_queue.on_thread_error = ->(error) { Appsignal.send_error(error) }

And for errors that happen inside jobs, you can hook into ApplicationJob and ApplicationMailer by adding this snippet:

rescue_from(Exception) do |exception|
  Appsignal.send_error(exception)
  raise exception
end

This will send all Solid Queue and job errors to AppSignal, but you can easily change it to your own error tracking service.

Conclusion

It has not even been a day, but I’m already very happy with the switch. If nothing else, it decreases my dependence on Redis. Not that I have anything against Redis, but it’s one more thing to maintain and keep an eye on.

Now the last piece of the puzzle will be switching to Solid Cache, and then I can finally say goodbye to Redis. Fewer service dependencies, fewer things to worry about, better uptime, happier users. 🚀

In Other News

As usual, there were many other changes since the last update, with the most significant being the introduction of a daily cap of 50 shots on the free plan. This adjustment is expected to affect only a very small fraction of users. However, those who exceed this limit are encouraged to upgrade to the premium plan, as their usage places considerable strain on the database.

Thank you! 🙏


Speeeeeeeeed

Long time no post 👋

There was nothing major enough worth announcing, so I didn’t write any post. But now I see I’ve done quite a lot since the last update, so let’s go through it all.

To first address the title of the post: I’ve made significant changes to how certain Shot information is stored and indexed, and the end result is a vastly faster Community search experience. The query time went from ~2s to ~50ms. It completely transforms how useful that page is.

I’ve also sped up my tests significantly, but that doesn’t really affect you. 😅

Recently John started adding API endpoints to Decent, so you can now also log in with Decent. You can do that by clicking your avatar in the upper right and then clicking Connect with Decent. Right now it doesn’t do anything beyond displaying the serial number(s) of your machine(s) on your Edit profile page, but I’m sure there’ll be more for me to do as John adds more API endpoints.

Another thing I’ve missed since the v4 redesign is opening shots in new tabs. You can now do this by cmd/ctrl-clicking any shot in the list. In some browsers even middle click works, but that’s outside of my control, since browsers are quite strict about which manipulations they allow in JavaScript.

I’ve also made a massive change to importing and parsing Beanconqueror shots with #98. These are now imported exactly as they are, and parsing for charts is done on the fly. I also display all the other data from the app in a much nicer way now. These are the first foundational steps toward what will be a new shot file format. It’s a collaboration between many parties, and there’s a ton still left to do, but I’m already super excited about the future. 🥳
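The “store raw, parse on the fly” idea can be sketched in plain Ruby — the JSON shape and field names below are invented for illustration and don’t reflect Beanconqueror’s actual file format:

```ruby
require "json"

# Hypothetical sketch: keep the uploaded file untouched and derive
# chart-ready series only when a chart is rendered. Field names are
# made up and do not match Beanconqueror's real format.
def chart_series(raw_json)
  data = JSON.parse(raw_json)
  data.fetch("pressure", []).map { |point| [point["time"], point["value"]] }
end

raw = '{"pressure":[{"time":0,"value":1.5},{"time":1,"value":8.9}]}'
chart_series(raw) # => [[0, 1.5], [1, 8.9]]
```

Keeping the original file intact means future parsing fixes apply retroactively to every previously imported shot, which is what makes this a good foundation for a new format.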