Defeating Spam and Saboteurs
----------------------------

The localized focus of this system ensures that one bad user, far away,
cannot effectively interfere with it by inserting masses of spam.

Data from far away will generally pass through a number of
discriminating computers before reaching its destination, and the
custom rule set of any one of those servers may be sufficient to stop
the spam. The headers can show how the packages travelled, in the
manner of an SMTP server's Received trail, so that data can be marked
as preferred once it has passed through certain systems. Data that has
never touched the internet can be considered more reliable, and servers
may choose to specialize in certain kinds of propagation. For instance,
some may carry local data only, while others focus upon e-mail
propagation, or even high-quality e-mail propagation with a whitelist
of senders or receivers.
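
As a rough illustration, each entry might carry an ordered trail of hop
records, much like SMTP Received lines. The record fields, node names,
and trusted-relay list below are assumptions made for this sketch, not
part of any fixed design:

    TRUSTED_RELAYS = {"relay.northside", "relay.downtown"}  # hypothetical IDs

    def preference_score(hops):
        """Score an entry from its ordered hop trace (first hop = origin)."""
        score = 0
        for hop in hops:
            if hop["node"] in TRUSTED_RELAYS:
                score += 2  # passed through a discriminating, trusted server
            if hop["via"] != "internet":
                score += 1  # this leg never touched the internet
        return score

    # Example: a two-hop entry that crossed the internet once scores 3.
    hops = [{"node": "relay.northside", "via": "radio"},
            {"node": "gateway.east", "via": "internet"}]
    print(preference_score(hops))

A server could then sort or filter incoming entries by such a score,
preferring data that stayed off the internet or passed through systems
it trusts.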

In the same way that the ham radio community produces members who are
eager to serve by providing repeaters, caring individuals will choose
to provide direct long-distance links over the internet or between
regions. Their systems are likely to specialize in spam destruction,
particularly for the sake of saving bandwidth.

Saboteurs may decide to draw users towards applications that refuse to
propagate data properly, and bad end users may even choose to receive
without giving. Missing data of quality will then be in demand, and
those who provide it will be sought out. Internet hosts who provide
reliable information will be able to entice users to themselves and
their other services. Local hosts who provide quality data will draw
attention, and perhaps customers for their businesses. Municipal hosts
will have a duty to provide quality data.

Nodes should consider testing the computers that they connect to for
willingness to yield data. Granting data to other systems in proportion
to their own willingness to yield it will flush out stingy users. One
scheme would involve offering a few entries before requesting some in
return, and refusing to provide more unless the other system proves
that it is not stingy. Some data should always be offered first, in
case the other system is new and bare.
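
One plausible shape for that exchange, with the seeding amount and the
fairness ratio chosen arbitrarily for illustration:

    SEED_ENTRIES = 5      # entries offered up front, for new and bare systems
    RATIO_REQUIRED = 0.5  # peer must return at least half of what it was given

    class PeerLedger:
        """Track give-and-take per peer to flush out stingy users."""

        def __init__(self):
            self.given = {}     # peer_id -> entries we have sent
            self.received = {}  # peer_id -> entries the peer has sent back

        def may_send(self, peer_id):
            given = self.given.get(peer_id, 0)
            if given < SEED_ENTRIES:
                return True  # always seed first, in case the peer is new
            received = self.received.get(peer_id, 0)
            return received >= given * RATIO_REQUIRED

        def record_sent(self, peer_id, count):
            self.given[peer_id] = self.given.get(peer_id, 0) + count

        def record_received(self, peer_id, count):
            self.received[peer_id] = self.received.get(peer_id, 0) + count

A node using this ledger keeps giving only to peers that reciprocate,
while a fresh peer with nothing yet to offer still receives its first
few entries.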

Entries should be rehashed on receipt, in case of fraudulent or
erroneous hashing, and headers should be hashed just as regular entries
are. Data that arrives missing its descriptors should be pushed to the
bottom of the file; some systems will be configured to delete such
data, considering it a waste of space.
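
A minimal sketch of the receipt check, assuming SHA-256 as the hash (no
particular function is specified here) and a simple two-list store:

    import hashlib

    def verify_entry(body: bytes, claimed_hash: str) -> bool:
        """Rehash on receipt to catch fraudulent or erroneous hashing."""
        return hashlib.sha256(body).hexdigest() == claimed_hash

    def file_entry(described, undescribed, entry):
        """Verify and place one entry; headers are handled the same way."""
        if not verify_entry(entry["body"], entry["hash"]):
            return False  # hash does not match the content: reject
        if entry.get("descriptors"):
            described.append(entry)    # normal placement
        else:
            undescribed.append(entry)  # sinks to the bottom of the file;
                                       # some systems delete these instead
        return True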

Good host/client software will allow the user to limit the number of
entries (or the quantity of data in general) accepted from any one
location within a certain amount of time. On a dense local network, a
tight limit will ensure that flooding spam is strongly restricted; on a
looser network, a much larger limit should be set because of data
scarcity. An alarm that instantly detects and reports a connection to a
bad node should be considered. A keyed MAC-address blacklist can be
voted upon and added to the data.
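
A sliding-window limiter is one way to realize this; the default limit
and window below are placeholders that a dense network would tighten
and a sparse one would loosen:

    import time
    from collections import defaultdict, deque

    class SourceLimiter:
        """Cap the entries accepted from one location per time window."""

        def __init__(self, limit=100, window=3600.0):
            self.limit = limit    # max entries per source per window
            self.window = window  # window length in seconds
            self.arrivals = defaultdict(deque)  # source -> arrival times

        def accept(self, source):
            now = time.time()
            times = self.arrivals[source]
            while times and now - times[0] > self.window:
                times.popleft()  # forget arrivals outside the window
            if len(times) >= self.limit:
                return False     # flooding: refuse further entries
            times.append(now)
            return True

A refusal here is also a natural place to trigger the alarm mentioned
above, and a repeatedly refused source is a candidate for the voted
blacklist.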

A note on key signing:
In order to stop spam from reaching individual recipients, or even
neighborhood classifieds, PGP keying can be used to verify identities.
Any message or header whose source wishes to prove itself can do so
with key signing, and the worst flooding and spam attacks can be
neutralized with keys. A local keyserver can be used if the internet is
unavailable. Distributed key submission and approval is even possible
inside of this system.
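
One possible concrete form of the check, assuming the third-party
python-gnupg package and a keyring filled from a local keyserver (the
keyring path and function name are hypothetical):

    import gnupg  # third-party python-gnupg package

    gpg = gnupg.GPG(gnupghome="/var/lib/node/keyring")  # hypothetical path

    def proven_sender(clearsigned_text: str):
        """Return the signer's fingerprint if the signature checks out."""
        result = gpg.verify(clearsigned_text)
        if result.valid:
            return result.fingerprint  # identity proven: deliver normally
        return None  # unsigned or bad signature: subject to the spam rules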

Hermann L. Johnson. January 2019. Free for unmodified redistribution and non-commercial use.