6 ways automation makes things worse

Who hasn’t dreamed of gently idling by a pool while AIs and low-code tools keep the enterprise stack running smoothly? Perhaps on a whim, we’ve decided to redesign a section of our latest popular web app. Without leaving the poolside, we utter a few commands. The code we need is generated and launched perfectly, with everything done right. And that’s it: our work for the quarter is done. Now we can really relax.

That’s a nice vision, but the present reality is more like a cold splash in the face. None of the existing tools work well enough to execute without considerable human interaction. Oh, they get it right sometimes. A code completion just works, or an automated sequence adjusts server-load parameters on the fly. We can be grateful for these moments when useful tools make our lives easier.

But such tools and automations fail, and when they do, the results range from inconvenient to catastrophic. This morning, I spent time on the phone with my domain registrar because a simple change to a DMARC record wasn’t sticking. The web application claimed the change was made but the system had not shared the new DNS value with the world. No matter what admins tried, the system would not budge. So, I’m looking for a new registrar while tech support tries to figure out what’s going on.

For every wonderful thing automation does, there’s an equal and opposite example of how it’s screwed up. To be fair, automation works most of the time, so the relationship isn’t completely symmetrical. But it’s just when you take your hands off the wheel (or go on vacation) that these systems seem to go rogue.

Here are six ways that automation can go off the rails.

Garbage collection

In theory, memory allocation is not something that human geniuses should have to worry our little heads about. Most modern languages have a layer that doles out chunks of memory and then sweeps them up when the data they contain is no longer needed. A good garbage collector, so the story goes, can free up programmers to think of bigger and more important things, like the value of their stock options.

Because garbage collection is automatic, you might expect memory leaks to be a thing of the past. They’re certainly less common than they used to be. But programmers can still hold onto blocks of data in ways that keep the garbage collector from ever reclaiming them. What’s worse, programmers no longer think memory leaks are their responsibility; that’s a job for the garbage collector! So, instead of hunting down the misallocated data, they’ll just increase the amount of RAM in the cloud server. How much of the cloud’s RAM is filled with data structures that should have been freed long ago?

There are other issues with automated memory management. Object allocation is one of the biggest time sinks for code, and smart programmers have learned that code runs faster if they allocate one object at the start of the program and then keep reusing it. In other words, set things up so the garbage collector doesn’t do anything.
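A minimal sketch of that reuse trick, with invented names: one scratch list is allocated at startup and reset in place, so the hot path creates no new container objects for the collector to chase.

```python
# Allocate one working object up front and reuse it, instead of
# building a fresh list on every call.
_scratch = []  # allocated once, reused on every call

def top_k(values, k):
    _scratch.clear()             # reset in place -- no new list object
    _scratch.extend(values)
    _scratch.sort(reverse=True)
    return _scratch[:k]

print(top_k([3, 1, 4, 1, 5], 2))  # -> [5, 4]
```

The tradeoff is that the scratch object is shared state: this version isn’t thread-safe, which is exactly the sort of care manual-style memory handling demands.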

As a more general problem, why does garbage collection always seem to happen at the most inconvenient time? The automation kicks in on its own schedule, and the garbage collector has no way of knowing (or caring) whether the resulting latency and lag will ruin the user experience. Developers who create user interfaces, or code that needs to run in, say, medical hardware, have good reason to worry about the timing of garbage collection.
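In Python, the standard-library gc module gives you some say over that timing: pay the collection cost before a latency-sensitive section and suspend cyclic collection while it runs. This is only a sketch; `render_frame` is an invented stand-in for the critical work, and reference counting still frees most garbage immediately even while the cycle collector is off.

```python
import gc

def render_frame(scene):
    # Stand-in for latency-sensitive work (rendering, device I/O, etc.)
    return sum(scene)

gc.collect()     # pay the collection cost now, before the critical section
gc.disable()     # no cyclic collection pauses during the frame
try:
    result = render_frame(range(1000))
finally:
    gc.enable()  # always restore normal automatic collection
print(result)
```

Leaving collection disabled for long stretches just trades pauses for growing memory, so the disabled window should stay short.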

Interpreted code

The various scripting languages have opened up coding and made it simpler to just knock off a few lines of code. Their relative simplicity and basic approach have won over many fans, not only among full-time programmers but also in related fields like data science. There’s a reason why Python is now one of the most commonly taught programming languages.

Still, the automation that makes these interpreted languages easier to use can also bring inefficiencies and security issues. Interpreted languages are usually slower, sometimes dramatically so. The combination of automated memory management, little time for optimization, and the general slog of runtime interpretation can really slow down your code.
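A rough, stdlib-only illustration of that interpretation overhead: a pure-Python loop pays a per-iteration cost that the C-implemented built-in sum() avoids, even though both compute the same answer. The exact timings depend on the machine, so the printout is indicative only.

```python
import timeit

def loop_sum(n):
    # Every iteration is dispatched by the interpreter
    total = 0
    for i in range(n):
        total += i
    return total

n = 1_000_000
t_loop = timeit.timeit(lambda: loop_sum(n), number=5)
t_builtin = timeit.timeit(lambda: sum(range(n)), number=5)  # loop runs in C

print(f"loop: {t_loop:.3f}s  builtin: {t_builtin:.3f}s")
```

On a typical CPython build the hand-written loop is several times slower, which is exactly the gap JIT compilers try to close.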

Leveraging the power of a good just-in-time (JIT) compiler can make things better. Python developers might turn to PyPy, an alternative implementation with a built-in JIT, or to projects like Numba and Pyston; Jython compiles Python to JVM bytecode so the Java JIT can take over; PHP 8 added a JIT of its own. But there are still limits to what the interpreter can do.

Some say that interpreted code is less secure. Compilers can spend extra time scrutinizing code before it ever runs, while the interpreter goes in the opposite direction, striving to keep its results “just in time.” Also, the dynamic typing popular with interpreted languages can make it easier to run injection attacks or other schemes. Of course, compiled code is sometimes just as vulnerable. All programmers need to be vigilant, no matter what language they’re using.
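Here’s a small sqlite3 sketch of the injection risk that loose, dynamic string handling invites; the table and values are invented. Splicing user input into the SQL text lets an attacker rewrite the query, while a bound parameter can never change the query’s shape.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is spliced straight into the SQL text,
# so the OR clause becomes part of the query.
leaky = conn.execute(
    f"SELECT secret FROM users WHERE name = '{attacker_input}'"
).fetchall()

# Safe: the driver binds the value; it is only ever treated as data.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(len(leaky), len(safe))  # injection leaks the row; the bound query finds nothing
```

The same discipline, always binding parameters rather than concatenating strings, applies in compiled languages too; dynamic typing just makes the sloppy version easier to write.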

Artificial intelligence

Artificial intelligence is a much bigger topic than automation, and I’ve discussed the various dark secrets and limitations of AI elsewhere. While AIs may be celebrated as modern miracles that are better than anyone expected, their output often has a bland, regurgitated feel once the novelty wears off. That makes sense because large language models (LLMs) are essentially massive averages of their training set.

Sometimes, AI makes things worse by tossing out random errors. The system is machine-gunning grammatically perfect sentences and well-structured paragraphs until—wait, what?—it suddenly hallucinates a made-up fact. To make matters worse, AI sometimes tosses out slander, libel, and calumny about living, breathing, and potentially litigious individuals. Whoops.

The best use of AIs seems to be as a not-so-smart assistant for smarter, more agile humans, who can keep this automated genius on a tight leash.

Database queries

In theory, databases are the original automated tool that can keep all our bits in nice, structured tables and answer our questions anytime we want. Oracle even slapped the label “autonomous” on its database to emphasize just how automated everything was. The modern enterprise couldn’t run without the magic of big databases. We need their raw power. It’s just that development teams quickly learn their limitations.

Sometimes fancy query engines are too powerful for their own good, such as when programmers create queries that take forever to complete. Writing simple SQL queries isn’t especially hard, but it can be very difficult to write a complex query that is also efficient. All the automation expended in storage and retrieval gives developers just enough rope to tie up their code in knots.
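One concrete version of that efficiency gap, sketched with the stdlib sqlite3 module and invented table names: SQLite’s EXPLAIN QUERY PLAN shows the same query going from a full table scan to an index lookup once an index exists.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer = 'cust7'"

# Ask the planner how it would run the query, before and after indexing.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
conn.execute("CREATE INDEX idx_customer ON orders(customer)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[-1][-1])  # e.g. "SCAN orders" -- every row is examined
print(after[-1][-1])   # e.g. "SEARCH orders USING INDEX idx_customer ..."
```

On a thousand rows nobody notices the difference; on a hundred million rows, the scan is the query that “takes forever,” and spotting it in the plan is the kind of thing a database specialist does routinely.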

Some teams can afford to hire specialized database administrators to keep the bits flowing smoothly. These specialists will tune the parameters and ensure there’s enough RAM to handle the indices without thrashing. When it’s time to create an SQL query with more than one clause, they know how to do it intelligently, so that the machine doesn’t grind to a halt.

Low-code platform automation

Some enterprise tools, portals, and web applications are now sophisticated enough to adapt themselves on the fly, with little or no new programming. Sales teams like to call this feature “low code” or even “no code.” It’s not inaccurate because the level of automation is pretty slick. But there are still some headaches bundled into the package.

The biggest problem is the same one that confronts the clothing industry, where customers know that “one size fits all” really means “one size fits none.” Each enterprise is a bit different, so each data warehouse, processing pipeline, and interface should also be different. Low-code and no-code options, though, offer one generalized system. Any customizations tend to be skin-deep.

This generalized code is often much slower because it has to be ready for anything. It’s constantly checking the data before formatting and reformatting it. All the glue code that automatically connects the pieces needs to run, often each and every time new data arrives. This boosts hardware costs and sometimes slows everything down.

Many teams will make even slow automation work because it’s easier and much cheaper than staffing a project to build the stack. But this means living with something that doesn’t really fit and often is just a bit pokier and more expensive to run.

Workflow automation (RPA)

A cousin of low-code and no-code development is RPA, or robotic process automation. Keep in mind that there aren’t any movie-grade robots in sight. The tools have found a home in offices because they’re useful for applying AI to common clerical tasks like juggling documents. Unfortunately, these tools have all the potential problems of both AI and low code.

A big selling point of RPAs is that they can put a modern interface on legacy stacks while also adding a bit of integration. This can be a fast way to put up a pretty face without changing any of the old code. Of course, it also means the old code doesn’t get updated or rewritten to modern standards, so the insides are stuffed with data structures and algorithms that date to the era of punch cards. RPA is like slapping technical duct tape on code that barely runs.

The real danger comes when the software works well enough to lull humans to sleep. Automation takes care of the manual steps that might otherwise give a human processor time to notice whether there’s something wrong with an invoice or order. Now, some manager just logs in and clicks the “approve all” button. Slowly the fraud and mistakes start to add up, as the checks and balances of traditional office procedures erode. The one person left—part-time, of course—lacks the tools and insight to understand what is happening before it is too late.

Zero automation

The only thing worse than adding more automation is adding none at all. The technical debt just never gets fixed. The software stack gets so outdated that it’s not worth upgrading anymore. As the stack slowly ossifies, so does everyone in the office. The enterprise is stuck doing things the same way they’ve always been done. The software stack rules the workflow.

It’s well and good to complain and take note of how software automation fails, but sometimes the best thing is to just use what we know about the pitfalls to plan strategically. In other words, factor in the downsides while trying to avoid them or find a better solution. The only thing worse than progress is no progress at all.

Copyright © 2023 IDG Communications, Inc.
