

Bit-breaker working in cybersecurity/IT. Only languages I know are English and Programming ones.
Sometimes I write things about technology.
If I told you the SHA256 for this sentence starts with 'c, 5, four, a, and a', would you believe me?
Yeah, Beehaw doesn’t have anywhere near that MAU, but I cannot fathom how it is costing him $5k a month to run a few instances, some of which have far fewer MAU.
Usenet is not the WWW. It operates over a different protocol and with different methods; it’s not served over HTTP(S) like this site or standard web sites, so your typical web user will never see or notice anything on or about Usenet. This is why you need a Usenet provider for access: you cannot access Usenet with a web browser. One notable difference between the web and Usenet is the absence of a central server and dedicated administrator. Usenet is distributed among a large, constantly changing conglomeration of servers that store and forward messages to one another in so-called news feeds. Individual users read messages from and post messages to a local server.
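As a rough illustration of the protocol difference, here is a minimal sketch of talking NNTP directly over a socket in Python. news.example.com is a placeholder for whatever server your Usenet provider gives you, and comp.lang.python is just an example newsgroup:

    import socket

    HOST = "news.example.com"  # placeholder: your provider's NNTP server
    PORT = 119                 # plain NNTP; most providers also offer 563 for TLS

    with socket.create_connection((HOST, PORT), timeout=30) as sock:
        conn = sock.makefile("rwb")

        def command(line: str) -> str:
            # NNTP is a line-based text protocol (like SMTP or POP3), not HTTP
            conn.write((line + "\r\n").encode("ascii"))
            conn.flush()
            return conn.readline().decode("ascii", "replace").strip()

        print(conn.readline().decode("ascii", "replace").strip())  # server greeting, e.g. "200 ..."
        print(command("GROUP comp.lang.python"))  # "211 <count> <first> <last> <group>"
        print(command("QUIT"))

A web browser has no idea what to do with that conversation, which is why you need dedicated client software or a provider’s gateway.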
Usenet is literally just a collection of text files on various servers or locations. There really isn’t a built-in index or a way to just ‘click to the next page’. This is why you need an indexer. An indexer crawls and scrapes Usenet headers so you can search for and find specific content or posts. It automatically assembles releases and indexes them, much like Google indexes the web.
When someone uploads files to Usenet, it’s just text. Very large files, such as videos, aren’t easily represented as text and don’t “fit” in one post. They are spread over many different posts, sometimes hundreds, in text format. You could find all those posts, combine the text, and end up with an actual video or music file, but that file doesn’t “exist” on Usenet as a specific, single item. Indexers find all posts associated with something you may be searching for, and newsreader software (like NZBGet or SABnzbd) combines all those text parts into one, giving you the file you actually expect after downloading all the different posts/parts.
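To make the multi-post idea concrete, here is a rough sketch of reading an NZB file (the XML index these tools consume) with Python’s standard library. example.nzb is a made-up filename, and the sketch assumes the file declares the usual newzbin namespace:

    import xml.etree.ElementTree as ET

    NS = "{http://www.newzbin.com/DTD/2003/nzb}"

    # an NZB is just an XML list of the Usenet posts (segments) that make up a download
    tree = ET.parse("example.nzb")

    for file_el in tree.getroot().iter(f"{NS}file"):
        segments = file_el.findall(f"{NS}segments/{NS}segment")
        total_bytes = sum(int(seg.get("bytes", 0)) for seg in segments)
        print(f"{file_el.get('subject')}: {len(segments)} segments, ~{total_bytes} bytes")
        # each segment's text content is the Message-ID of one post; a downloader
        # fetches every post, decodes it (usually yEnc), and stitches the parts
        # back together into the original file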
Using a VPN account with usenet is beneficial, but not required. It is ideal to have access to multiple different indexers to find the posts you want.
Yes, you can. Though, as another commenter mentioned, doing it the way you currently attempted is way too much of a hassle.
That is, don’t try to negotiate your own SMTP session and content. Since Pop!_OS is Ubuntu-based, you should be able to use the mail command from the mailutils package:
echo "Is this working?" | mail -s "Subject" [email protected]
Also, you might want to consider something like Apprise (everything and the kitchen sink) or ntfy (does one thing, does it well) for other types of notification methods.
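For example, here is a minimal sketch of pushing a notification through ntfy’s HTTP publish API with Python’s requests library. my-server-alerts is a made-up topic name; treat topics like passwords, or self-host the server:

    import requests  # third-party: pip install requests

    # publishing is just an HTTP POST to https://ntfy.sh/<topic>
    requests.post(
        "https://ntfy.sh/my-server-alerts",
        data="Backup finished without errors".encode("utf-8"),
        headers={"Title": "Nightly backup"},
        timeout=10,
    )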
Speaking as an admin of an instance here, 33 requests a minute is not “all good”.
33 HTTP GET requests per minute to a given instance.
That is way beyond acceptable use and would likely get your service blocked. These services exist, too:
https://lemmy.fediverse.observer/stats
Maybe those do what you’re trying to do?
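If you really do need to poll instances yourself, something along these lines keeps the rate reasonable. The instance list, User-Agent string, and intervals here are all made up for illustration:

    import time
    import requests  # third-party: pip install requests

    INSTANCES = ["beehaw.org", "lemmy.ml"]  # sample list, not a recommendation
    PASS_INTERVAL = 15 * 60                 # seconds between full passes; be generous

    while True:
        for host in INSTANCES:
            # one lightweight request per instance per pass, with an identifying User-Agent
            resp = requests.get(
                f"https://{host}/api/v3/site",
                headers={"User-Agent": "my-stats-bot/0.1 (contact: admin@example.com)"},
                timeout=10,
            )
            print(host, resp.status_code)
            time.sleep(5)           # small gap even between different hosts
        time.sleep(PASS_INTERVAL)   # then back off for a long while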
There is not an “admin inbox” for Lemmy instances. You can hit the /api/v3/site endpoint for information about an instance, including the admins list.
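A quick sketch of pulling the admin list that way with Python’s requests. lemmy.ml is just an example instance, and the key names assume the response shape I’ve seen (admins as a list of person views), so adjust if your target version differs:

    import requests  # third-party: pip install requests

    instance = "lemmy.ml"  # example instance
    resp = requests.get(f"https://{instance}/api/v3/site", timeout=10)
    resp.raise_for_status()
    data = resp.json()

    # "admins" is a list of person views on the versions I've looked at
    for admin in data.get("admins", []):
        person = admin.get("person", {})
        print(person.get("name"), "-", person.get("actor_id"))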
Please; no.
Which one? There were multiple links in that comment.
Because at the end of the day TypeScript is still JavaScript, and it’s still bad. It just adds some verbose syntax to try to make a weakly typed language (JavaScript) appear strongly typed. It adds build steps that shouldn’t be there; build steps make sense for apps, but they make much less sense for libraries.
https://dev.to/bettercodingacademy/typescript-is-a-waste-of-time-change-my-mind-pi8
https://medium.com/@tsecretdeveloper/typescript-is-wrong-for-you-875a09e10176
It’s because of a filter on the EasyList blocklist. To fix the improper block without removing the entire list or rule, go to My rules and add:
* https://startrek.website/pictrs/image/ xmlhttprequest allow
then click Save, then click Commit to make it survive restarts. The EasyList filter that triggers the block is:
^https?:\/\/.*\.(club|bid|biz|xyz|site|pro|info|online|icu|monster|buzz|website|biz|re|casa|top|one|space|network|live|systems|ml|world|life|co)\/.*\/$~image,~media,~subdocument,third-party,domain=(list)
Building is different than doing a compose up. If you’re making changes to the Dockerfile and want to build your changes, then
DOCKER_BUILDKIT=1 docker build -t custom/lemmy:latest .
should be the correct method. Then you would change the docker-compose file to reference your newly built image instead of using the Docker Hub one. See this document.
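For reference, the relevant part of the compose file would end up looking something like this; the service name lemmy and the custom/lemmy:latest tag are assumptions carried over from the build command above:

    services:
      lemmy:
        # point at your locally built image instead of the upstream Docker Hub one
        image: custom/lemmy:latest
        # ...the rest of the service definition stays the same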
Bug fixes. Too many to count.
Same. It’s not a perfect language, far from it. It is simple but not easy. I too believe it helps with understanding exactly what the code is doing by being lower level.
C is an old language lacking many, many, many modern features. One of the features it does not lack is encapsulation and isolation.
The problem is teaching non-techies how to use that static site generator. Start talking about HTML or git and their eyes will glaze over. Definitely not sustainable.
Definitely. vim is hard to get used to, but after you do, it’s damn powerful, especially with plugins. It’s always nice to be able to do typing and coding entirely on the keyboard without needing to move your hands to the mouse for something. Also, if you do any Linux CLI stuff, you almost always have access to vi at LEAST. So being familiar with the tool when the GUI and something like nano aren’t available is invaluable.
:wq
Very nice write up explaining why you want this changed.
Can you share a minimal reproducible example of the error in your login functions?
Why would allies share their plans with an entity that has zero opsec and is probably compromised at the signals intelligence level, let alone with the Russian puppet circus that is the US government?
Hard pass.