Natural Law And Issues of Life

Taxing Bots: Bringing Law to the Lawless Machine Crowd

Posted on February 14, 2026 by jamesq

By James Quillian – Economist, Political Analyst, Teacher of Natural Law

First: Bots are not free. They are cheap for their owners and expensive for everyone else.

Second: The damage bots do is a classic negative externality – the costs are pushed onto society.

Third: Those who could control bots choose not to, because they profit from their lawless existence.

Fourth: The natural-law remedy is simple: tax bots at the points where they touch the system.

Fifth: Even “good” bots create externalities; they should carry their own weight instead of shifting the burden to their victims.

What a bot really is

A bot is nothing mystical. It is simply an automated agent that acts in place of a human being. It clicks, it posts, it scrapes, it buys, it sells, it pretends to be a person when it suits its owner’s purpose. The important thing is not the code, but the force behind it: whose will is being projected, and at whose expense.

From the standpoint of natural law, a bot is an extension of the human who deploys it. It is a tool, like a plow or a printing press. But unlike a plow, a bot can be multiplied by the thousands and unleashed across the world at almost no marginal cost. That is where the trouble begins.

“Free” is never free

In economics, whenever something appears to be free, you can be sure the cost has only been moved, not removed. Bots are a perfect example. The cost of operating a bot is so low that, for practical purposes, it rounds to zero for the operator. But the cost does not disappear. It is pushed outward onto:

  1. Attention: People must sift through noise, spam, and manipulation just to find what is real.
  2. Security: Systems must be hardened, monitored, and patched against automated attacks.
  3. Infrastructure: Networks, servers, and platforms must carry and process vast amounts of bot traffic.
  4. Trust: Public confidence in information, markets, and institutions is steadily eroded.

These are not imaginary costs. They are real drains on time, money, and social cohesion. That is what economists call a negative externality: a cost created by one party and forced onto others who did not consent and are not compensated.
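The externality logic above can be put into toy arithmetic. All figures below are hypothetical, invented only to show the shape of the argument: the operator's private cost per action is nearly zero, while the social cost is that private cost plus the burdens pushed onto attention, security, infrastructure, and trust.

```python
# Toy illustration of a negative externality (all figures hypothetical).
# The operator's private cost per bot action is tiny; the social cost is
# the private cost plus what is pushed onto everyone else.

def social_cost(private_cost, external_costs):
    """Total cost society bears: the operator's cost plus externalities."""
    return private_cost + sum(external_costs.values())

private_cost = 0.0001          # operator's cost per bot action, in dollars
external_costs = {             # costs pushed onto third parties per action
    "attention": 0.002,        # human time spent filtering noise
    "security": 0.001,         # hardening, monitoring, patching
    "infrastructure": 0.0005,  # bandwidth and processing
    "trust": 0.001,            # eroded confidence (hard to price; a guess)
}

per_action = social_cost(private_cost, external_costs)
million_actions = per_action * 1_000_000
print(f"social cost per action: ${per_action:.4f}")
print(f"social cost of a million actions: ${million_actions:,.0f}")
```

The point of the sketch is not the numbers, which are made up, but the asymmetry: the operator pays the first term and everyone else pays the rest.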

Good bots, bad bots, and the shared burden

There are bots that perform useful functions. Search engine crawlers index the web. Monitoring bots watch for outages. Some bots help with accessibility or routine customer service. These are what people like to call “good bots.”

Then there are the “bad bots”: scrapers that steal content, armies of fake accounts that distort public opinion, scalpers that buy up tickets and products, automated fraud systems, and so on. Their purpose is to extract value by deception or force, not by honest trade.

But here is the key point: both kinds of bots impose costs on society. The good ones may be useful, but they still consume bandwidth, processing power, and human attention. They still require defenses, filters, and verification systems. The difference is not that good bots create no externalities, but that their externalities are smaller and sometimes offset by benefits.

Under natural law, the rule is simple: you carry the costs of the forces you unleash. If your tool imposes burdens on others, you are responsible for those burdens. That is the moral foundation for taxing bots.

Why bots are not reined in

It is important to understand that bots can be controlled. Platforms can identify automated traffic. Governments can require registration, licensing, and accountability. The technology to do this already exists. The absence of control is not a technical failure; it is a deliberate choice.

Who benefits from leaving bots largely lawless?

  1. Large platforms: Bot traffic inflates user counts, engagement metrics, and ad impressions. Bigger numbers justify higher valuations and more advertising revenue.
  2. Advertisers and marketers: Bots can amplify campaigns, manufacture trends, and simulate public enthusiasm at low cost.
  3. Political operators: Bots can flood the public square with noise, fear, and confusion, steering opinion without honest debate.
  4. Financial actors: Automated trading and market manipulation can move prices faster than human participants can react.
  5. Criminal networks: Bots scale fraud, identity theft, and other abuses the way factories scale production.

There is also a quieter benefit: fear itself is profitable. If people are frightened by scams, disinformation, and cyber attacks, they become eager customers for security products, verification services, and “trust solutions.” There is great profit in scaring people and then selling them something to ease the fear. A world full of uncontrolled bots is a perfect environment for that business model.

Taxing bots as a natural-law remedy

Natural law is not about writing more rules; it is about aligning human behavior with the forces that actually govern outcomes. When a tool creates external costs, the remedy is to bring those costs back to the source. That is what a tax on bots would do.

The question is not whether we can tax bots, but where in the process we place the weight. Bots touch the system at several natural choke points:

  1. At the server: Every bot must run on some machine. A usage-based tax on automated processes above a certain volume would push operators to internalize the cost of their activity.
  2. At the network: Bots consume bandwidth. Network-level metering and classification can distinguish high-volume automated traffic and apply a charge.
  3. At the platform: APIs and automated access points can require registration, identity, and a per-transaction fee for non-human traffic.
  4. At the identity layer: Large-scale bot operations can be required to obtain a license, just as other high-impact activities are licensed.
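The first choke point, a usage-based tax above a volume threshold, can be sketched in a few lines. The allowance and rate here are invented for illustration only, not a proposal for specific figures; the structural point is that small, honest automation stays untaxed while flood-scale operations pay per action.

```python
# Toy sketch of a usage-based bot tax (threshold and rate are hypothetical).
# Automated activity below a free allowance is untaxed; volume above it
# pays per action, pushing the operator to internalize the cost.

FREE_ALLOWANCE = 10_000   # untaxed automated actions per month (assumed)
RATE = 0.001              # tax per action above the allowance, in dollars

def bot_tax(actions_per_month):
    """Tax owed for a month of automated activity."""
    taxable = max(0, actions_per_month - FREE_ALLOWANCE)
    return taxable * RATE

print(bot_tax(5_000))       # small operator: owes nothing
print(bot_tax(10_000_000))  # flood-scale operator: owes 9990.0
```

A graduated rate schedule could replace the single flat rate, but even this crude threshold reproduces the intended effect: the cost of one more action is no longer zero.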

The purpose of such taxation is not to create another revenue stream for governments to waste. The purpose is to re-balance the forces: to make it uneconomical to flood the world with low-cost harm while leaving the victims to pay the bill.

“But won’t that hurt innovation?”

Whenever someone’s easy profits are threatened, the word “innovation” is brought out as a shield. The truth is simpler. Honest innovation can bear the cost of its own footprint. If a bot delivers genuine value, it can survive while paying a fair share of the costs it imposes on the shared environment.

What cannot survive under such a regime is the business model of cheap, deniable harm: the model where you can deploy a million fake voices, a million fake clicks, and a million fake trades, and let everyone else pay for the confusion and cleanup.

From numbers to forces

Most commentary on bots is numerical: how many accounts, how many attacks, how many billions of dollars in losses. I am not interested in chasing those numbers. I am interested in the forces that create them.

The forces at work here are straightforward:

  1. Automation multiplies power. One person with a botnet can project more force than a thousand honest individuals acting by hand.
  2. Low marginal cost invites abuse. When the cost of one more action is near zero, the temptation is to flood the system.
  3. Externalized costs invite irresponsibility. If others pay for your damage, you have no natural reason to stop.
  4. Fear is a commodity. A frightened population is easier to steer and easier to sell to.
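The second and third forces combine into one toy break-even calculation. With hypothetical figures, a flood of low-value automated actions is profitable at near-zero marginal cost and becomes unprofitable the moment even a small per-action charge is imposed:

```python
# Toy break-even sketch (all figures hypothetical). An operator floods
# the system whenever expected revenue per action exceeds marginal cost.

def flood_profit(actions, revenue_per_action, cost_per_action):
    """Expected profit from a flood of automated actions."""
    return actions * (revenue_per_action - cost_per_action)

actions = 1_000_000
revenue_per_action = 0.0005   # tiny value extracted per fake action (assumed)
marginal_cost = 0.00001       # near-zero operating cost per action (assumed)
tax = 0.001                   # small per-action charge (assumed)

untaxed = flood_profit(actions, revenue_per_action, marginal_cost)
taxed = flood_profit(actions, revenue_per_action, marginal_cost + tax)

print(f"profit with near-zero cost: ${untaxed:,.0f}")  # positive: flood pays
print(f"profit with a small tax:   ${taxed:,.0f}")     # negative: flood stops
```

The numbers are invented, but the sign flip is the whole argument: reconnecting cost to action makes the flood irrational without banning the tool.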

Taxing bots is not about punishing technology. It is about restoring the natural connection between action and consequence. If you unleash a force into the world, you are responsible for the wake it leaves behind.

Where this leads

If bots remain effectively untaxed and unaccountable, the natural outcome is more noise, more fraud, more manipulation, and less trust. Human beings will retreat into smaller circles of confidence, and the common spaces will be left to the machines and those who control them.

If, instead, we recognize bots as carriers of negative externalities and tax them accordingly, several things happen at once. The worst abuses become uneconomical. The platforms that quietly profit from inflated numbers lose their incentive to look the other way. The honest uses of automation remain, but they must carry their own weight.

That is the natural-law position: no one has the right to shift the cost of their tools onto unwilling strangers. Bots are not an exception. They are simply the latest way to test whether we still believe that principle.

James Quillian writes on economics, politics, and natural law, focusing on the forces that create the numbers others measure.
