What's Up With the Latest Glitches

Some (or many) of you might have noted or suspected recent site anomalies besides the failure of the top-ten scores to update without refreshing the screen. For example:

  • A familiar calibration movie you were sure was flowing was reported as a stall, but with the stall location shown far outside the target oval
  • Annotating a movie you thought you recognized as a calibration gave you only 116 or fewer points
  • What you thought was a “real” movie from the dataset was scored as a calibration
  • Sometimes refreshing the screen, especially with a calibration movie, results in a new movie being displayed; other times it does not

Anyway, Pietro graciously provided an update on what was going on, and said I should share it with other catchers who might have similar questions. Following is a slightly edited version of what he provided.

One possible culprit for these mysterious changes is a large segment of our code that had been modified before (our former coder) left the project, but had never been properly compiled. There were also changes made in preparation for a new, improved interface design. Recently, we improved the automation in our process for updating code on the servers. We did not realize that this suddenly compiled all the changes the former coder had made before she left, many of which introduced (or reverted to) odd behaviors. We are now systematically working to fix these.

Additionally, our new developers wish to rewrite the entire code base from scratch to make it easier to understand and maintain, and to create new features. (Any of you who’ve ever done any coding know how hard it can be to understand another coder’s work.) This means that they are dividing their time between addressing issues, such as the ones you encountered, in the legacy code, and building something newer and “shinier” from scratch. So we have a bit of a balancing act in terms of allocating dev resources. On top of that, we are supporting an experiment in human/AI cooperation that will be run separately from Stall Catchers, but using Stall Catchers movies and open to any catchers who want to participate. This will probably run in early to mid December. Of course this is pulling some dev resources away as well, but supporting this effort helps us continue to keep Stall Catchers running now and in the future.

Much of this may not be directly pertinent to your specific observations, but (Pietro) wanted to share the broader context so you know what’s going on.


Thanks so much for posting this @caprarom :purple_heart:

Thanks Caprarom

Just an addendum related to certificate errors, for those who might be interested. Prompted by today's certificate error on the Dream Catchers site (which the team is working on), Pietro shared the following information:

"We are trying to establish a rhythm on this. We recently (end of last year) purchased a service that auto-renews our certificates. At the time, I thought that would prevent situations like this. As it turns out, it’s not renewing the existing certificates, it’s obtaining new ones each time, that have to be installed manually. And we have several sites using these. So we are trying to sync up the renewal dates so that every two months, I can just go through and do them all in one fell swoop before anything goes offline or generates certificate errors for users. Ideally, we’ll come up with an automated system for installing the new certificates … "

Pietro goes on to say that they are working to get the new version of Stall Catchers up and running, and creating a special new feature for an upcoming Dream Catchers event. These upgrades, of course, remain high priority.

Mike C.


Some of you might soon notice a change in the number of points you receive upon redemption. This is because Admin has found a fix for one of the bugs affecting our legacy software, and it should be implemented soon.

You know that the points you receive upon answering a calibration movie correctly, or annotating a “real” movie from the current data set, are directly dependent upon your skill level. For example, if your blue skill bar is at max, you get 116 points for annotating a real movie, but only 58 points if your skill bar is at half-mast. The same is supposed to be true when you redeem points, but that has not been the case lately due to the bug: we’ve all been getting the maximum redemption award (2320 points per pending movie) regardless of our skill level.
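For anyone who likes to see the arithmetic, here is a minimal sketch of that scaling. The linear relationship is an assumption based on the numbers above (116 points at a full skill bar, 58 at half); the actual scoring and redemption formulas may differ.

```python
# Minimal sketch of the proportional scoring described above. The linear
# scaling is an assumption based on the examples given (116 points at max
# skill, 58 at half skill); the real formula may differ.
MAX_MOVIE_POINTS = 116        # max points for annotating a real movie
MAX_REDEMPTION_POINTS = 2320  # max redemption per pending movie (from the post)

def movie_points(skill_fraction):
    """Points for a real movie, assuming linear scaling with skill (0.0-1.0)."""
    return round(MAX_MOVIE_POINTS * skill_fraction)

def redemption_points(skill_fraction, pending_movies):
    """Redemption award once the fix applies skill scaling to redemptions."""
    return round(MAX_REDEMPTION_POINTS * skill_fraction) * pending_movies

print(movie_points(1.0))          # 116 at a full skill bar
print(movie_points(0.5))          # 58 at half skill
print(redemption_points(0.5, 3))  # 3480 for 3 pending movies at half skill
```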

With this correction, newer / less experienced players will likely notice the change in point redemptions. Seasoned veterans with high skill levels won’t notice much of a change.

I hope this explanation is clear, but don’t hesitate to ask for more clarification if needed.

Mike C.
