I love the indieweb philosophy. This is a bit off topic from the tech sites usually shared here, but my partner and I run a recipe site, with recipes mostly from around the world. This is the domain (in Spanish, sorry!): https://www.foodisea.com
This is amazing! We talked about recipes at Homebrew Website Club a few weeks ago. We asked ourselves "how could we share our own recipes online?"
The layout of your recipes and the focus on the process without extraneous information makes me want to learn more about cooking. I have always been scared off by recipe sites. Your site looks like a great entry point for me.
Congrats on shipping! I love the simplicity of the idea. It reminds me a lot of Drop to sell[0]. At first glance both cover the same use case; is there any significant difference?
We will handle international taxation for sellers and issue invoices to buyers. With us you also avoid the hassle of creating your own Stripe account, API keys, etc. We learned that the expertise required to set all of that up just to sell a file was a huge pain point for creators.
> Psyco was a module that monkeypatched CPython to add a custom JIT compiler. Pyjion wants to introduce a proper C API for adding a JIT compiler to CPython instead of monkeypatching it. It should be noted that the creator of Psyco went on to be one of the co-founders of PyPy.
> IronPython is an implementation of Python that is implemented using .NET. While IronPython tries to be usable from within .NET, Pyjion does not have a compatibility story with .NET. This also means IronPython cannot use C extension modules while Pyjion can.
I don't think pretrained models are "beginner building blocks". They provide a kind of "base model" that can be fine-tuned to your specific need. The advantage of not training the full model is that you save a lot of resources (both design time and computation).
There are many boring but meaningful tasks where this can be used. For example, I'm sure many industries could benefit from image classification for very specific cases (e.g., fruit categorization). In those cases you are not interested in the classification of "general" objects such as car, person, bike, or horse (as provided by the MobileNet pretrained model), but you can use that model as a base to classify different categories of fruit.
Therefore, you are right that they might not be very useful to build new ML algorithms or network architectures. But they are useful to build specific (and novel) uses of current neural networks.
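The idea above can be sketched in a few lines of numpy. Everything here is illustrative: a fixed random projection stands in for a frozen pretrained backbone (in practice you would load, say, MobileNet weights), and the "fruit" labels are synthetic. Only the small classification head is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained backbone: a fixed random ReLU projection.
# In a real setup this would be e.g. MobileNet with its weights frozen.
n, d, h, k = 200, 20, 16, 2          # samples, input dim, feature dim, classes
w_base = rng.normal(size=(d, h))     # "pretrained" weights: never updated
x = rng.normal(size=(n, d))
y = (x[:, 0] > 0).astype(int)        # toy binary "fruit" labels

feats = np.maximum(x @ w_base, 0.0)                  # frozen features
feats = (feats - feats.mean(0)) / feats.std(0)       # standardize

# Trainable head: multinomial logistic regression on top of frozen features.
w_head = np.zeros((h, k))
onehot = np.eye(k)[y]
for _ in range(500):
    logits = feats @ w_head
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = feats.T @ (p - onehot) / n
    w_head -= 0.1 * grad             # only the head is updated

acc = (np.argmax(feats @ w_head, axis=1) == y).mean()
print(f"head-only accuracy: {acc:.2f}")
```

The point is that the expensive part (the backbone) is reused as-is; only a tiny task-specific head is designed and trained.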
Just a side note: it's probably not fair to attribute the introduction of experience replay to the DQN authors. The technique was popularized in 1992 by Lin[1] and later used in many works to improve performance. The difference, perhaps, is that in DQN experience replay is not just a way to improve performance; it's a must for obtaining stability.
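For anyone unfamiliar with the technique, a replay buffer is tiny to sketch. This is a minimal illustration (names and the toy transitions are made up), showing the two operations that matter: FIFO eviction and uniform random sampling, which breaks the correlation between consecutive transitions.

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-capacity store of (state, action, reward, next_state, done) tuples.

    Old transitions are evicted FIFO via deque's maxlen; sampling is uniform,
    which decorrelates consecutive transitions (the stability benefit).
    """
    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)

    def push(self, state, action, reward, next_state, done):
        self.buffer.append((state, action, reward, next_state, done))

    def sample(self, batch_size):
        batch = random.sample(self.buffer, batch_size)
        return tuple(zip(*batch))        # columns: states, actions, ...

    def __len__(self):
        return len(self.buffer)

# Toy usage: push 150 dummy transitions into a 100-slot buffer,
# so the oldest 50 get dropped, then draw a minibatch.
buf = ReplayBuffer(capacity=100)
for t in range(150):
    buf.push(t, t % 4, 1.0, t + 1, False)
states, actions, rewards, next_states, dones = buf.sample(32)
print(len(buf), len(states))  # 100 32
```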
Related page with another API about Coronavirus: https://covid19api.com/
I haven't tried it yet, but it looks well designed. Data is sourced from Johns Hopkins.
Not obvious to me either, but it means Total Fertility Rate. From Wikipedia: the TFR of a population is basically the average number of children that would be born to a woman over her lifetime.
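Concretely, TFR is computed by summing age-specific fertility rates (births per woman per year in each age band) times the width of each band. A toy calculation, with completely made-up rates for illustration:

```python
# Age-specific fertility rates: births per woman per year, in 5-year bands.
# These numbers are invented for illustration, not real data.
asfr = {
    (15, 20): 0.02,
    (20, 25): 0.09,
    (25, 30): 0.11,
    (30, 35): 0.08,
    (35, 40): 0.04,
    (40, 45): 0.01,
}

# TFR: expected lifetime births if a woman experienced these rates
# at every age, i.e. sum of rate * band width.
tfr = sum(rate * (hi - lo) for (lo, hi), rate in asfr.items())
print(f"TFR = {tfr:.2f}")  # prints TFR = 1.75
```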