101010.pl is one of the many independent Mastodon servers you can use to participate in the fediverse.
101010.pl is the oldest Polish Mastodon server. Posts can be up to 2,048 characters.

#chatbots

Replied in thread

@alternativeto

No problem. I contacted them at support@protonme.zendesk.com, and they answered very quickly (within a couple of hours):

"Hello,

Thank you for reaching out.

Please note that we typically add the open-source code repository a few weeks after releasing a product, so it should be available soon.
You can keep an eye on it here: proton.me/community/open-source

Thank you for your patience.

If you have any other questions or concerns, feel free to let us know.

Kind regards,

Nikola L.
Customer Support
Proton Mail"

So, let’s wait 🙂

Proton · An Open Source Privacy Company: "All our apps (Proton Mail, Proton Drive, etc.) are open source and independently audited. Anyone can inspect our software and confirm our encryption works."
#AI #LLM #LLMs

"Simon Willison has a plan for the end of the world. It’s a USB stick, onto which he has loaded a couple of his favorite open-weight LLMs—models that have been shared publicly by their creators and that can, in principle, be downloaded and run with local hardware. If human civilization should ever collapse, Willison plans to use all the knowledge encoded in their billions of parameters for help. “It’s like having a weird, condensed, faulty version of Wikipedia, so I can help reboot society with the help of my little USB stick,” he says.

But you don’t need to be planning for the end of the world to want to run an LLM on your own device. Willison, who writes a popular blog about local LLMs and software development, has plenty of compatriots: r/LocalLLaMA, a subreddit devoted to running LLMs on your own hardware, has half a million members.
For people who are concerned about privacy, want to break free from the control of the big LLM companies, or just enjoy tinkering, local models offer a compelling alternative to ChatGPT and its web-based peers.

The local LLM world used to have a high barrier to entry: In the early days, it was impossible to run anything useful without investing in pricey GPUs. But researchers have had so much success in shrinking down and speeding up models that anyone with a laptop, or even a smartphone, can now get in on the action. “A couple of years ago, I’d have said personal computers are not powerful enough to run the good models. You need a $50,000 server rack to run them,” Willison says. “And I kept on being proved wrong time and time again.”"
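The "shrinking down" the article mentions is largely quantization: storing each weight in fewer bits so the whole model fits in a laptop's memory. A rough back-of-envelope sketch (illustrative figures of my own, not from the article; weights only, ignoring activations, KV cache, and runtime overhead):

```python
def weight_memory_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate storage for a model's weights in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_weight / 8 / 1e9

# A 7-billion-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"7B model at {bits}-bit: ~{weight_memory_gb(7e9, bits):.1f} GB")
# 16-bit: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB
```

At 4-bit precision a 7B model needs only about 3.5 GB for its weights, which is why models that once demanded server-class hardware now run on an ordinary laptop or even a phone.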

technologyreview.com/2025/07/1

MIT Technology Review · "How to run an LLM on your laptop", by Grace Huckins