I’m using CloudFlare to hide my home IP and to reduce traffic from clankers. However, I’m using the free tier, so how am I the product? What am I sacrificing? Is there another way to do the above without selling my digital soul?
I have never used it, so take this with a grain of salt, but last I read, on the free tier you could not secure traffic between yourself and Cloudflare with your own certs, which implies they can decrypt and read that traffic. What, if anything, they do with that capability I do not know. I just do not trust my hosted assets to be secured with certs/keys I do not control.
There are other things CF can do (bot detection, DDoS protection, etc.), but if you just want to avoid exposing your home IP, a cheap VPS running Nginx can work the same way as a CF tunnel. Set up WireGuard on the VPS and have the Nginx backends on the VPS reach your home assets through that tunnel. If the VPS is the “server” side of the WG tunnel, you don’t have to open any local ports on your router at all. I’ve been doing that, originally with OpenVPN, since before CF tunnels were ever offered as a service.
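A minimal sketch of that layout, assuming the tunnel subnet is 10.0.0.0/24 with the VPS at 10.0.0.1 and the home box at 10.0.0.2 (keys, addresses, and ports are all placeholders):

```ini
# /etc/wireguard/wg0.conf on the VPS (the "server" side of the tunnel)
[Interface]
Address    = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <vps-private-key>

[Peer]
# the home box; it dials out to the VPS, so the home router
# never needs a port forward
PublicKey  = <home-box-public-key>
AllowedIPs = 10.0.0.2/32
```

Nginx on the VPS then proxies to the tunnel address, e.g. `proxy_pass http://10.0.0.2:8080;`, and only the VPS’s IP is ever publicly visible.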
Edit: You don’t even need WG, really. If you set up a persistent SSH tunnel and forward/bind a port to your VPS, you can tunnel the traffic over that.
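One way to sketch that, assuming a local service on port 3000 and `autossh` on the home box (host and ports are placeholders):

```shell
# Run on the home box. -R binds 127.0.0.1:8080 on the VPS and forwards
# it back to the local service; -N opens no remote shell; autossh
# restarts the tunnel if it drops.
autossh -M 0 -N \
  -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
  -R 127.0.0.1:8080:127.0.0.1:3000 user@vps.example.com
```

Nginx on the VPS then proxies to `127.0.0.1:8080`. If you want the forwarded port bound to a non-loopback address instead, `GatewayPorts` has to be enabled in the VPS’s `sshd_config`.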
I don’t get this whole “expose my IP” worry. It’s not a secret, and people are scanning it whether you have a port open or not. The whole IPv4 range is constantly being scanned.
I have the same setup, but using frp, which stands for “fast reverse proxy.”
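For anyone curious, a minimal frp pairing looks roughly like this (classic INI format; addresses and ports are placeholders):

```ini
# frps.ini on the VPS
[common]
bind_port = 7000

# frpc.ini on the home box; it dials out to the VPS
[common]
server_addr = vps.example.com
server_port = 7000

[web]
type           = http
local_port     = 8080
custom_domains = example.com
```

Newer frp releases use TOML config files instead, but the shape is the same: the client connects outward, so nothing needs to be exposed at home.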
The term VPN is pure marketing BS. What is called a VPN today used to be called a Proxy Server.
I’ve also heard good things about using Pangolin for the same setup.
Good luck I’m behind nine proxies!
I used to use HAProxy but switched to Nginx so I could add the modsecurity module and run WAF services. I still use HAProxy for some things, though.
Oh, I forgot to say: I have CrowdSec on the VPS in front of frp, and Traefik on the server at my home, where I add all the modules I want.
frp just pipes all the packets through transparently.
But yeah, same thing, should work the same and there are dozens of ways to set that all up.
I’ve been looking into crowdsec for ages now and still haven’t gotten around to even a test deployment. One of these days, lol, and I’ll get around to it.
It’s pretty neat and I feel like there is a clear value exchange for both parties in the free tier, so less shady than cloudflare.
I don’t see an issue yet, even though they are crowdsourcing their list generation. At least they give you something for contributing, and you can opt out and just take; but if you do, you get smaller lists.
Perhaps if you are only talking about the consumer level stuff advertised on TV. Otherwise I can assure you that “Virtual Private Networks” are a real thing that have absolutely nothing to do with Proxy Servers.
Further down the comment chain you mention “…our computers would not see each other and would not be able to connect to each other via that service” as some kind of test of whether a thing is a VPN or a Proxy Service, but what you’re missing is that this is a completely common and advisable configuration for companies. In fact, Zero Trust essentially demands configurations like this. When Bob from Marketing fires up his VPN to the corporate office, he doesn’t need access to every server and desktop there, nor does his laptop need to be able to access the laptops of other VPN users. They get access to what they need and nothing more.
Hell, whether internet traffic goes through the tunnel at all (split tunneling) is also controllable.
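As a concrete illustration, in WireGuard that choice lives in the client’s `AllowedIPs` (the subnets below are placeholders):

```ini
# client-side [Peer] section
# split tunnel: only these corporate subnets ride the VPN
AllowedIPs = 10.10.0.0/16, 192.168.50.0/24

# full tunnel: everything, internet included, rides the VPN
# AllowedIPs = 0.0.0.0/0, ::/0
```

Corporate VPN products expose the same knob through policy rather than a config file, but the effect is identical.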
It’s that ability to control where the tunnel terminates that allows consumer VPNs, like Proton, to be used the way they are.
So while private individuals absolutely do use VPNs as an ersatz replacement for Proxy Servers they are nowhere near the whole use case for VPNs.
You can do the same split tunneling via proxy servers.
I agree. That also means that for certain use cases they are equivalent. It’s sometimes worth checking all the options to find the best one for a specific case.
In my experience even a site with low legitimate traffic will eventually buckle under the torrent of bots and scrapers if it’s up long enough to get indexed by search engines, so the longer my stuff is out there the more I anticipate I will need DDoS protection.
I’ve run a publicly accessible, low-legitimate-traffic website, indexed by Google and others, from my home network for over 20 years without anything buckling so far. I don’t even have a great connection (30 Mbps upstream).
Maybe I’m just lucky?
I’ve got bot detection set up in Nginx on my VPS, which used to return 444 (Nginx-speak for “close the connection and waste no more resources processing it”), but I recently started piping that traffic to Nepenthes to return gibberish data for them to train on.
I documented a rough guide in the comment here. Of relevance to you are the two `.conf` files at the bottom. In `deny-disallowed.conf`, change the `return 301 ...` line to `return 444`. I also use a firewall and fail2ban on the VPS to block bad actors, overly aggressive scrapers, password brute-forcers, etc., and the link between the VPS and my homelab equipment never sees that traffic.
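The general shape of that Nginx setup, as a sketch rather than the linked guide verbatim (the user-agent patterns, upstream name, and Nepenthes port are all assumptions):

```nginx
# flag obviously unwanted user agents
map $http_user_agent $is_bot {
    default                     0;
    ~*(GPTBot|CCBot|Bytespider) 1;      # example patterns only
}

server {
    # ...listen, server_name, certs...

    location / {
        if ($is_bot) {
            # return 444;                   # old behavior: drop silently
            rewrite ^ /nepenthes$uri last;  # new: feed the tarpit
        }
        proxy_pass http://homelab_backend;  # upstream over the tunnel
    }

    location /nepenthes {
        internal;                           # only reachable via rewrite
        proxy_pass http://127.0.0.1:8893;   # assumed Nepenthes address
    }
}
```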
In the case of a DDoS, I’ve done the following:
Granted, I’m not running anything mission-critical, just some services for friends and family, so I can deal with a little downtime.
I have something similar with fail2ban + hidden buttons. If the requester goes and clicks on the hidden buttons on the main site, it gets into a rabbit hole. After 3 requests, it gets banned for a bit. Usually stops the worst offenders. OpenAI and some of the scrapers are the worst.
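A rough sketch of that trap, with all names assumed: a link humans never see, plus a fail2ban filter and jail keyed to it.

```html
<!-- hidden anywhere in the page; real browsers never render or click it -->
<a href="/trap/do-not-click" style="display:none" rel="nofollow">stats</a>
```

```ini
# /etc/fail2ban/filter.d/honeypot.conf
# (log format assumed to be Nginx's default combined format)
[Definition]
failregex = ^<HOST> .* "GET /trap/

# jail.local snippet
[honeypot]
enabled  = true
filter   = honeypot
logpath  = /var/log/nginx/access.log
maxretry = 3
bantime  = 1h
```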
Google/Bing, I do actually see them hit robots.txt and then jump off, which is what they should be doing.
Oooooh. That’s smart. I mostly host apps, but in theory I should be able to dynamically modify the response body, tack on some HTML for a hidden button, and do that.
I used to disallow everything in robots.txt but the worst crawlers just ignored it. Now my robots.txt says all are welcome and every bot gets shunted to the tarpit 😈
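If you want to copy the trick, the welcome mat is just a permissive robots.txt (detection and the shunt to the tarpit happen in the proxy regardless of what the bot claims to respect):

```
# robots.txt: everyone is "welcome"
User-agent: *
Allow: /
```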
Nice! That’s another way to do it. 😀
I know others use Anubis(?), I think that’s what it’s called. The anime girl one that runs a proof-of-work check in front. I’ve never had good luck with it. I think bots are using something to get around it, and it messes with my requests. Might also be my own fiddling.