More often than not: “OP deletes account and content, haha suckers”
I’m surprisingly level-headed for being a walking knot of anxiety.
Ask me anything.
Special skills include: Knowing all the “na na na nah nah nah na” parts of the Three’s Company theme.
I also develop Tesseract UI for Lemmy/Sublinks
Avatar by @SatyrSack@feddit.org


Like you’re thinking: put HAProxy on your OpenWRT router.
That’s what I do. My HAProxy setup is deliberately “dumb”: it runs in TCP (layer 4) mode rather than HTTP/S mode, since I wanted all of my HTTP logic in the Nginx services. The main things HAProxy does are, like you’re looking for, putting the SPOF alongside the other unavoidable SPOF (the router), and wrapping requests in the PROXY protocol so the downstream Nginx services see the correct client IP.
Flow is basically:
LAN/WAN/VPN -> HAProxy -> Two Nginx Instances -> Apps
With HAProxy in the router, it also lets me set internal DNS records for my apps to my router’s LAN IP.
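A minimal sketch of that kind of HAProxy config (backend names and IPs are placeholders; assumes the Nginx boxes terminate TLS themselves and listen with `proxy_protocol` enabled):

```haproxy
# TCP passthrough: no TLS termination, no HTTP parsing here
frontend https_in
    bind :443
    mode tcp
    option tcplog
    default_backend nginx_pool

backend nginx_pool
    mode tcp
    balance roundrobin
    # send-proxy wraps each connection in the PROXY protocol
    # so Nginx can recover the real client IP
    server nginx1 192.168.1.11:443 send-proxy check
    server nginx2 192.168.1.12:443 send-proxy check
```

On the Nginx side, `listen 443 ssl proxy_protocol;` plus `real_ip_header proxy_protocol;` and a `set_real_ip_from` for the router’s IP makes the client address available as usual.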


Depends on the device.
On some, I flash LineageOS, which treats installing MicroG or the Google services as an optional extra step. I just skip that step.
Some that aren’t supported by LineageOS but can be rooted: I just remove or disable all the Google services and apps.
Even on stock phones that can’t be rooted or bootloader unlocked, you can still disable things like Google Play Services. You may have to manually silence persistent notifications saying Play Services are unavailable.
In the latter two cases, you’ll often need to find alternative apps (like the phone dialer, SMS messenger, etc) before you can disable the built-in Google ones.


Years ago, before I de-googled, I set the language in Android to “English (UK)” so the assistant voice would be British. The side effect of that was minor UI labels such as my “Flashlight” quick action changing to “Torch”.


- Are screenshots of other platforms like Reddit, X, or Bluesky ‘off topic’? Or do they count as memes?
I’m not a fan of them but they’re not something I would report as off-topic. But given the chance to voice an opinion on them, I’d say they’re more appropriate for the “people Twitter”-style social-media inception communities.
- Does a meme require an attempt at humor?
I would say yes. I don’t subscribe here to be preached at. But there’d be some potentially awkward judgement calls for the purposes of enforcing that.
- What about news? (What if a headline is funny?)
There’s !nottheonion@lemmy.world and similar for those and news/politics permeates everything. It’s nice to have a place away from that.
- Is an image required? Should a picture of text (not from another platform) be removed?
That’s a tough one. I’ve seen pictures of book text being presented, but that often (though not always) bumps up against #2 (attempt at humor).


I remember both of those going downhill in real time but the one that hit me the hardest was Discovery Channel. One day it was Bill Nye, Mythbusters, Shark Week, and science documentaries and the next it was Pawn Stars, Swamp Loggers, and American Chopper. At least we got a nice meme template out of that last one.
Fuck it, it’s Christmas Tuesday.


I’ve been looking into crowdsec for ages now and still haven’t gotten around to even a test deployment. One of these days, lol, I’ll get around to it.


Oooooh. That’s smart. I mostly host apps, but in theory, I should be able to dynamically modify the response body and tack on some HTML for a hidden button and do that.
I used to disallow everything in robots.txt but the worst crawlers just ignored it. Now my robots.txt says all are welcome and every bot gets shunted to the tarpit 😈
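If you want Nginx to hand out that welcoming robots.txt without a file on disk, something like this works (a sketch; the body is just the standard allow-everything robots.txt):

```nginx
# Serve a permissive robots.txt directly from Nginx
location = /robots.txt {
    default_type text/plain;
    return 200 "User-agent: *\nAllow: /\n";
}
```

The bot-detection map then decides who actually gets real content versus the tarpit, regardless of what robots.txt promised.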


I’ve got bot detection set up in Nginx on my VPS, which used to return 444 (Nginx-speak for “close the connection and waste no more resources processing it”), but I recently started piping that traffic to Nepenthes to return gibberish data for them to train on.
I documented a rough guide in the comment here. Of relevance to you are the two .conf files at the bottom. In the deny-disallowed.conf, change the line for return 301 ... to return 444
I also utilize a firewall and fail2ban on the VPS to block bad actors, overly aggressive scrapers, password brute-forcing, etc, and the link between the VPS and my homelab equipment never sees that traffic.
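As a rough sketch of the fail2ban side (jail names are the stock ones shipped with fail2ban; log paths and timings are placeholders to tune):

```ini
# /etc/fail2ban/jail.local (sketch)
[sshd]
# bans repeated SSH password brute-force attempts
enabled  = true
maxretry = 3
bantime  = 1h

[nginx-limit-req]
# bans IPs that repeatedly trip Nginx's limit_req zones
enabled  = true
logpath  = /var/log/nginx/error.log
findtime = 10m
maxretry = 5
bantime  = 6h
```

The `nginx-limit-req` jail only does something if you’ve also defined `limit_req_zone`/`limit_req` in your Nginx config.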
In the case of a DDoS, I’ve done the following:
Granted, I’m not running anything mission-critical, just some services for friends and family, so I can deal with a little downtime.


I used to use HAProxy but switched to Nginx so I could add the modsecurity module and run WAF services. I still use HAProxy for some things, though.


I have never used it, so take this with a grain of salt, but last I read, with the free tier, you could not secure traffic between yourself and Cloudflare with your own certs which implies they can decrypt and read that traffic. What, if anything, they do with that capability I do not know. I just do not trust my hosted assets to be secured with certs/keys I do not control.
There are other things CF can do (bot detection, DDoS protection, etc), but if you just want to avoid exposing your home IP, a cheap VPS running Nginx can work the same way as a CF tunnel. Set up WireGuard on the VPS and point your Nginx upstreams at your home assets over that tunnel. If the VPS is the “server” side of the WG tunnel, you don’t have to open any local ports on your router at all. I’ve been doing that, originally with OpenVPN, since before CF tunnels were ever offered as a service.
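A minimal sketch of the VPS side (the WG peer address 10.0.0.2, the hostname, and the cert paths are all placeholders):

```nginx
# On the VPS: public hostname proxied to a home server
# reachable over the WireGuard tunnel (10.0.0.2 = home peer)
server {
    listen 443 ssl;
    server_name app.example.com;

    ssl_certificate     /etc/letsencrypt/live/app.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/app.example.com/privkey.pem;

    location / {
        proxy_pass http://10.0.0.2:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Since the home peer initiates the WG connection outbound, the router never needs a port forward.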
Edit: You don’t even need WG, really. If you set up a persistent SSH tunnel and forward/bind a port to your VPS, you can tunnel the traffic over that.
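A sketch of the persistent-tunnel approach as a systemd unit on the home machine (user, host, and ports are placeholders; assumes key-based auth to the VPS is already set up):

```ini
# /etc/systemd/system/ssh-tunnel.service (sketch)
[Unit]
Description=Persistent reverse SSH tunnel to VPS
After=network-online.target
Wants=network-online.target

[Service]
# -N: no remote command; -R: bind VPS port 8080 back to the local app
ExecStart=/usr/bin/ssh -N \
    -R 127.0.0.1:8080:localhost:8080 \
    -o ServerAliveInterval=30 \
    -o ExitOnForwardFailure=yes \
    tunnel@vps.example.com
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```

Nginx on the VPS then just proxies to 127.0.0.1:8080, exactly like the WG variant.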


I’m always wary of any .news domain because literally anyone can run a blog and buy a cheap .news domain to point to it.
My record is 9.9 years (and going). This is my old ThinkPad T420, which lives on my equipment rack to act as an SSH / web console for my equipment. I just close the lid and put it to sleep when it’s not in use. It doesn’t even connect to the internet, just my isolated management VLAN.

My HomeAssistant server (also an old ThinkPad) is the next longest at just under a year. It also lives on an isolated VLAN.
Both of these are repurposed laptops with batteries in good condition and thus have built-in UPS (in addition to the UPS they’re plugged into).
The rest average about 4-7 months depending on power outages (rare but when they do occur, they’re longer than my UPS can provide) and rebooting for kernel updates.
You can also self host it: https://github.com/schlagmichdoch/pairdrop


This probably isn’t the answer you want to hear, but yes, I still consider it slop.
Not everyone is an artist, and that’s okay. Just do your best, and even the worst chicken-scratch doodles are better than what any AI craps out.


Not before sleep, but sometimes I’ll wake up from a dream with a deep feeling of inspiration. The few times I’ve written down what I dreamed and thought was so insightful/promising/a solution to a problem, I read it in the morning and was like “this is the dumbest thing I’ve ever read”.


So, I set this up recently and agree with all of your points about the actual integration being glossed over.
I already had bot detection set up in my Nginx config, so adding Nepenthes just meant changing the behavior of that. Previously, I had just returned either 404 or 444 to those requests, but now it redirects them to Nepenthes.
Rather than trying to do rewrites and pretend the Nepenthes content is under my app’s URL namespace, I just do a redirect which the bot crawlers tend to follow just fine.
There are several parts to this, each in its own include file, to keep my config sane:
- An include file that looks at the user agent, compares it to a list of bot UA regexes, and sets a variable to either 0 or 1. By itself, that include file doesn’t do anything more than set that variable. This allows me to have it as a global config without having it apply to every virtual host.
- An include file that performs the action if the variable is set to true. This has to be included in the server portion of each virtual host where I want the bot traffic to go to Nepenthes. If this isn’t included in a virtual host’s server block, then bot traffic is allowed.
- A virtual host where the Nepenthes content is presented. I run it on a subdomain (content.mydomain.xyz). You could also do this as a path off of your protected domain, but this works for me and keeps my already complex config from getting any worse. Plus, it was easier to integrate into my existing bot config. Had I not already had that, I would have run it off of a path (and may go back and do that when I have time to mess with it again).
The map-bot-user-agents.conf is included in the http section of Nginx and applies to all virtual hosts. You can either include this in the main nginx.conf or at the top (above the server section) in your individual virtual host config file(s).
The deny-disallowed.conf is included individually in each virtual host’s server section. Even though the bot detection is global, if the virtual host’s server section does not include the action file, then nothing is done.
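Putting the two includes together looks roughly like this (the include paths are placeholders for wherever you keep your snippets):

```nginx
# nginx.conf -- the map applies globally to all virtual hosts
http {
    include /etc/nginx/conf.d/map-bot-user-agents.conf;
    # ... the rest of the http-level config ...
}

# a protected virtual host
server {
    listen 443 ssl;
    server_name app.mydomain.xyz;

    # opt this vhost in to the bot action; vhosts
    # without this line let bot traffic through
    include /etc/nginx/snippets/deny-disallowed.conf;

    # ... locations, proxy_pass, etc ...
}
```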
Note that I’m treating Google’s crawler the same as an AI bot because…well, it is. They’re abusing their search position by double-dipping on the crawler so you can’t opt out of being crawled for AI training without also preventing it from crawling you for search engine indexing. Depending on your needs, you may need to comment that out. I’ve also commented out the Python requests user agent. And forgive the mess at the bottom of the file. I inherited the seed list of user agents and haven’t cleaned up that massive regex one-liner.
# Map bot user agents
## Sets the $ua_disallowed variable to 0 or 1 depending on the user agent. Non-bot UAs are 0, bots are 1
map $http_user_agent $ua_disallowed {
    default 0;
    "~PerplexityBot" 1;
    "~PetalBot" 1;
    "~applebot" 1;
    "~compatible; zot" 1;
    "~Meta" 1;
    "~SurdotlyBot" 1;
    "~zgrab" 1;
    "~OAI-SearchBot" 1;
    "~Protopage" 1;
    "~Google-Test" 1;
    "~BacklinksExtendedBot" 1;
    "~microsoft-for-startups" 1;
    "~CCBot" 1;
    "~ClaudeBot" 1;
    "~VelenPublicWebCrawler" 1;
    "~WellKnownBot" 1;
    #"~python-requests" 1;
    "~bitdiscovery" 1;
    "~bingbot" 1;
    "~SemrushBot" 1;
    "~Bytespider" 1;
    "~AhrefsBot" 1;
    "~AwarioBot" 1;
    # "~Poduptime" 1;
    "~GPTBot" 1;
    "~DotBot" 1;
    "~ImagesiftBot" 1;
    "~Amazonbot" 1;
    "~GuzzleHttp" 1;
    "~DataForSeoBot" 1;
    "~StractBot" 1;
    "~Googlebot" 1;
    "~Barkrowler" 1;
    "~SeznamBot" 1;
    "~FriendlyCrawler" 1;
    "~facebookexternalhit" 1;
    "~*(?i)(80legs|360Spider|Aboundex|Abonti|Acunetix|^AIBOT|^Alexibot|Alligator|AllSubmitter|Apexoo|^asterias|^attach|^BackDoorBot|^BackStreet|^BackWeb|Badass|Bandit|Baid|Baiduspider|^BatchFTP|^Bigfoot|^Black.Hole|^BlackWidow|BlackWidow|^BlowFish|Blow|^BotALot|Buddy|^BuiltBotTough|^Bullseye|^BunnySlippers|BBBike|^Cegbfeieh|^CheeseBot|^CherryPicker|^ChinaClaw|^Cogentbot|CPython|Collector|cognitiveseo|Copier|^CopyRightCheck|^cosmos|^Crescent|CSHttp|^Custo|^Demon|^Devil|^DISCo|^DIIbot|discobot|^DittoSpyder|Download.Demon|Download.Devil|Download.Wonder|^dragonfly|^Drip|^eCatch|^EasyDL|^ebingbong|^EirGrabber|^EmailCollector|^EmailSiphon|^EmailWolf|^EroCrawler|^Exabot|^Express|Extractor|^EyeNetIE|FHscan|^FHscan|^flunky|^Foobot|^FrontPage|GalaxyBot|^gotit|Grabber|^GrabNet|^Grafula|^Harvest|^HEADMasterSEO|^hloader|^HMView|^HTTrack|httrack|HTTrack|htmlparser|^humanlinks|^IlseBot|Image.Stripper|Image.Sucker|imagefetch|^InfoNaviRobot|^InfoTekies|^Intelliseek|^InterGET|^Iria|^Jakarta|^JennyBot|^JetCar|JikeSpider|^JOC|^JustView|^Jyxobot|^Kenjin.Spider|^Keyword.Density|libwww|^larbin|LeechFTP|LeechGet|^LexiBot|^lftp|^libWeb|^likse|^LinkextractorPro|^LinkScan|^LNSpiderguy|^LinkWalker|msnbot|MSIECrawler|MJ12bot|MegaIndex|^Magnet|^Mag-Net|^MarkWatch|Mass.Downloader|masscan|^Mata.Hari|^Memo|^MIIxpc|^NAMEPROTECT|^Navroad|^NearSite|^NetAnts|^Netcraft|^NetMechanic|^NetSpider|^NetZIP|^NextGenSearchBot|^NICErsPRO|^niki-bot|^NimbleCrawler|^Nimbostratus-Bot|^Ninja|^Nmap|nmap|^NPbot|Offline.Explorer|Offline.Navigator|OpenLinkProfiler|^Octopus|^Openfind|^OutfoxBot|Pixray|probethenet|proximic|^PageGrabber|^pavuk|^pcBrowser|^Pockey|^ProPowerBot|^ProWebWalker|^psbot|^Pump|python-requests\/|^QueryN.Metasearch|^RealDownload|Reaper|^Reaper|^Ripper|Ripper|Recorder|^ReGet|^RepoMonkey|^RMA|scanbot|SEOkicks-Robot|seoscanners|^Stripper|^Sucker|Siphon|Siteimprove|^SiteSnagger|SiteSucker|^SlySearch|^SmartDownload|^Snake|^Snapbot|^Snoopy|Sosospider|^sogou|spbot|^SpaceBison|^spanner|^SpankBot|Spinn4r|^Sqworm|Sqworm|Stripper|Sucker|^SuperBot|SuperHTTP|^SuperHTTP|^Surfbot|^suzuran|^Szukacz|^tAkeOut|^Teleport|^Telesoft|^TurnitinBot|^The.Intraformant|^TheNomad|^TightTwatBot|^Titan|^True_Robot|^turingos|^TurnitinBot|^URLy.Warning|^Vacuum|^VCI|VidibleScraper|^VoidEYE|^WebAuto|^WebBandit|^WebCopier|^WebEnhancer|^WebFetch|^Web.Image.Collector|^WebLeacher|^WebmasterWorldForumBot|WebPix|^WebReaper|^WebSauger|Website.eXtractor|^Webster|WebShag|^WebStripper|WebSucker|^WebWhacker|^WebZIP|Whack|Whacker|^Widow|Widow|WinHTTrack|^WISENutbot|WWWOFFLE|^WWWOFFLE|^WWW-Collector-E|^Xaldon|^Xenu|^Zade|^Zeus|ZmEu|^Zyborg|SemrushBot|^WebFuck|^MJ12bot|^majestic12|^WallpapersHD)" 1;
}
# Deny disallowed user agents
if ($ua_disallowed) {
    # This redirects them to the Nepenthes domain. So far, pretty much all the bot crawlers have been happy to accept the redirect and crawl the tarpit continuously
    return 301 https://content.mydomain.xyz/;
}


Let me first say that I love the idea of this phone and it breaking free of the “tall, skinny rectangle” form factor. The physical keyboard is a huge draw for me as well. However, there are some things on the software side that are definitely making me wary.
I don’t understand why it would limit apps. $499 is a lot to spend on a secondary device, and I don’t know that I’d want to EDC two devices. That’s a lie. I know I wouldn’t want to everyday carry two phones because I did that for work and absolutely hated it.
Most apps work fine on smaller screens. I’ve been daily driving a Cat S22 Flip with a portrait-oriented 480x640 screen for over a year, and most apps scale just fine.
I’ve at least heard of Niagara Launcher. Is that saying the only way to use those apps is through the launcher’s integration? That sounds shitty.
Dear god, no.
So, a feature that has existed for years but was taken away from us? My old OnePlus had a customizable RGB light which could be configured the same way. It was really handy, and I hated the “always on” display that replaced it. I could tell from the color and pattern what kind of notification it was without having to preview it, which was nice as it didn’t stress me out with a need to reply.
I want to like this, but it seems like they’re being very opinionated on how you actually use it. Maybe we’ll get lucky and it’ll be bootloader unlockable and LineageOS can save it from the shitty decisions of the manufacturer.
Edit: Submitted a question/ticket to their support. Every time a promising-looking device is announced, I always ask. The answer is usually either “What? What do you mean?” or “No”. One of these days, there will hopefully be a manufacturer that doesn’t equate Android with Google.