Hosting
This section concerns tiny self-hosters, hobbyists, and small-scale website content operators.
Some problems associated with hosting leave operators exposed to malfunctioning hierarchical systems like the US government of 2025.
A corporate hosting service can decide to cooperate with a despotic regime, or capitulate to financial pressure from a monopolistic payment provider.
These are not hypothetical scenarios. Mastercard and Visa have forced content off the internet by refusing to provide financial services to entire industries.
Host Features
Dedicated hosts run by large companies are used routinely because they have the following characteristics:
Bandwidth
Hosts are typically provisioned to serve high bandwidth to clients, unlike home internet connections.
Resilience
Usually there is a software abstraction between the host's bare metal and the host's users: compute resources are virtualized and hardware management is abstracted away, so individual hardware failures need not take a user's site down.
Load Balancing & Domain Name Service (DNS)
Hosts, whose offerings blur into those of Content Delivery Network (CDN) providers, commonly serve multiple DNS records per name, so that resolution itself spreads load evenly and steers clients toward geographically close servers.
When an address is resolved, the server that handles the request frequently reverse proxies, or even load-balances without TLS termination, to yet more servers. This, combined with multiple load-balanced IP addresses per DNS entry, is a very robust way to deliver static content.
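As an illustration, here is a minimal sketch in Go of that front-end role, assuming a hypothetical pool of internal origins (the `origin-1.internal` and `origin-2.internal` addresses are placeholders): the front-end round-robins requests across origins, as a host's edge server might after DNS resolution directs a client to it.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	// Hypothetical pool of origin servers behind this front-end.
	backends := []*url.URL{
		{Scheme: "http", Host: "origin-1.internal:8080"},
		{Scheme: "http", Host: "origin-2.internal:8080"},
	}
	var next uint64
	proxy := &httputil.ReverseProxy{
		// Rewrite (Go 1.20+) picks a backend per request, round-robin.
		Rewrite: func(r *httputil.ProxyRequest) {
			b := backends[atomic.AddUint64(&next, 1)%uint64(len(backends))]
			r.SetURL(b)
		},
	}
	log.Fatal(http.ListenAndServe(":80", proxy))
}
```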
Content Delivery
If demand for a data resource increases, Content Delivery Networks (CDNs) can scale in response, serving files from locations geographically close to the sources of demand.
This includes caching and file redundancy: a CDN under significant demand for a set of files is almost comparable to a file storage service, because it holds redundant copies.
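For such caching to work, the origin has to mark its responses cacheable. A minimal sketch in Go, assuming a site rooted at `./public`: a static file server that sets `Cache-Control` so shared caches, CDN edges included, may keep redundant copies.

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	files := http.FileServer(http.Dir("./public"))
	handler := func(w http.ResponseWriter, r *http.Request) {
		// Any shared cache (a CDN edge included) may keep this for a day.
		w.Header().Set("Cache-Control", "public, max-age=86400")
		files.ServeHTTP(w, r)
	}
	log.Fatal(http.ListenAndServe(":8080", http.HandlerFunc(handler)))
}
```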
Host Failure Modes
A host's failure modes fall into several categories, covering both the host's own malicious behaviour and the consequences of using a host when others behave maliciously. Those affecting a static website's ability to keep responding to users are discussed below.
Cost
Even leaving aside problems caused by payment providers, hosts can generate significant financial costs for their users, driven by traffic to the user's content. Such traffic can be either an organic cost of the content being popular, or an artificial cost inflicted by a distributed denial of service (DDoS) attack.
Such traffic need not affect resilience, because the host simply absorbs it, at the expense of increased billing for:
- File Delivery - what I characterise here as costs associated with file transmission to clients, consisting of:
  - Bandwidth
  - Content Delivery
- DNS - this is often also a billed cloud hosting service.
Because such costs are not directly controllable by the user, commercial CDNs are either ineffective (if spending is capped, content stops being served during a spike) or dangerous (if uncapped, the user carries unbounded financial liability).
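To make the liability concrete, a back-of-the-envelope sketch; the request count, file size, and per-GB rate below are illustrative assumptions, not quoted prices.

```go
package main

import "fmt"

func main() {
	const (
		requests = 1_000_000 // hypothetical spike: a million downloads
		fileMB   = 2.0       // hypothetical average file size, in MB
		usdPerGB = 0.09      // assumed egress rate; real rates vary by provider
	)
	egressGB := requests * fileMB / 1024
	fmt.Printf("spike egress: %.0f GB -> bill: $%.2f\n", egressGB, egressGB*usdPerGB)
}
```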
I omit storage and compute here. Compute is typically not the bottleneck of a service that provides files, so its costs are mostly unaffected. Storage is completely unaffected unless some dynamic scaling has been configured - and if you're doing dynamic scaling, I don't think you're a small operation, and this document is not about you.
Partial (File Delivery) Solution - 302 Temporary Redirection
A partial solution to the problem of traffic spikes compromising reliability for small services is a peer-to-peer (P2P) system of provisionally trusted nodes that can take on the task of hosting file content.
Such a network can make use of the HTTP 302 (Found) status code, which signals a temporary redirect. This solution reaches its scaling limit when the peer that first held the content is using all of its resources to issue redirect responses; at that point, no further scaling can be achieved using this technique alone.
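A minimal sketch of the redirect mechanism in Go, assuming a hypothetical, statically configured peer list whose members mirror this site's files under the same paths; trust establishment and peer health checking, the hard parts, are elided.

```go
package main

import (
	"log"
	"math/rand"
	"net/http"
)

// Hypothetical peers that have agreed to mirror our files under the same paths.
var peers = []string{
	"https://peer-a.example.net",
	"https://peer-b.example.net",
}

func overloaded() bool {
	// Placeholder for a real load signal: bandwidth used this billing
	// period, concurrent transfers, and so on.
	return false
}

func main() {
	files := http.FileServer(http.Dir("./public"))
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if overloaded() {
			// A 302 costs us one tiny response instead of a full file transfer.
			peer := peers[rand.Intn(len(peers))]
			http.Redirect(w, r, peer+r.URL.Path, http.StatusFound)
			return
		}
		files.ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```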
Pros
- Uses existing protocols.
- Does not introduce new requirements on client applications like web browsers.
- Does not introduce new requirements on web applications or static sites.
- Could potentially achieve a significant increase in traffic spike stability over lone-hosting solutions.
- Cheap. Existing small peers can band together, caching each other's files, and sharing bandwidth. Assuming traffic spikes affect small numbers of peers at a time, this is an essentially free bandwidth upgrade.
- Server-side operations serving the public are not a typical small-website concern, because server resources are limited. This proposal therefore fits the static hosting needs of small operations well.
Cons
- Has a fixed limit to its scaling potential.
- Requires significant technical effort to establish and maintain trust between peers.
- It is not clear that the theoretical maximum-load configuration (the peer serving the domain redirects all requests to other peers) is consistent with maintaining trust between peers.
- Suffers from the peer discovery problem present in all P2P networks.
- May suffer from cascading failure as redirect destination peers stop participating to conserve their own resources.
Deplatforming
As mentioned above, Mastercard and Visa deplatformed roughly 20k risqué games on Steam.
Solution
There is no way to force a host to host anything. The best short-term defence against the hosting market dropping a customer is redundancy and routing between hosts, such that any one host's failure is automatically worked around.
Longer term, going outside jurisdictions or outside of usual hosting methods may be necessary, but that's not going to keep your pages online the day your host drops you.
This must apply to DNS as much as to file delivery. It's a harder problem, and isn't discussed here (for now).
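One minimal form of the redundancy described above, sketched in Go with hypothetical mirror URLs: a fetch helper tries each host in order, so a dropped or failed host is worked around without user-visible failure.

```go
package main

import (
	"errors"
	"fmt"
	"io"
	"net/http"
	"time"
)

// Hypothetical mirrors holding identical copies of the site.
var mirrors = []string{
	"https://host-a.example.com",
	"https://host-b.example.org",
}

// fetch tries each mirror in order, so one host being dropped or down
// does not stop the content from being retrieved.
func fetch(path string) ([]byte, error) {
	client := &http.Client{Timeout: 5 * time.Second}
	for _, m := range mirrors {
		resp, err := client.Get(m + path)
		if err != nil {
			continue // unreachable: try the next mirror
		}
		data, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err == nil && resp.StatusCode == http.StatusOK {
			return data, nil
		}
	}
	return nil, errors.New("all mirrors failed")
}

func main() {
	body, err := fetch("/index.html")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("fetched %d bytes\n", len(body))
}
```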
Monitoring & Exfiltration
Hosts can monitor all traffic, and can transparently exfiltrate any information stored unencrypted in any medium, including the RAM of hosted systems.
Solution for Monitoring & Exfiltration
The main consequence of this is that all data stored on a host must be either a) public or b) encrypted at all times.
Regarding b), this implies either that the host is used only as a store for encrypted files, or that Homomorphic Encryption (HE) is employed.
Cryptpad uses approach b) to avoid trusting its host. The host is simply a file store, and all decryption and mutation operations reside on the client, where the decryption key must be present.
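A minimal sketch of that pattern in Go, assuming the host is an untrusted blob store and leaving key management and the upload itself out of scope: the client seals content with AES-GCM before upload, so the host only ever holds ciphertext.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

// sealForUpload encrypts plaintext under key; the returned blob
// (nonce || ciphertext || tag) is all the host ever sees.
func sealForUpload(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // key must be 16, 24, or 32 bytes
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	// Seal appends the ciphertext and auth tag to the nonce prefix.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

func main() {
	key := make([]byte, 32) // in practice derived from a user-held secret
	if _, err := rand.Read(key); err != nil {
		panic(err)
	}
	blob, err := sealForUpload(key, []byte("page draft"))
	if err != nil {
		panic(err)
	}
	fmt.Printf("host stores %d opaque bytes\n", len(blob))
}
```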
Furthermore, to avoid metadata extraction from request monitoring, clients must use VPNs when interacting with untrusted hosts.