Power of Artificial Intelligence in Software Development: 5 Myths and Realities

Artificial intelligence offers real value, but recognizing its limitations is critical for actually capitalizing on that value.
Artificial intelligence, or AI, is one of the most intriguing topics in software development today. It is also one of the most widely misunderstood. For software developers and IT teams, AI offers an array of tantalizing possibilities for making applications faster, more scalable and more efficient. In many cases, however, the hype surrounding AI doesn't line up with what is actually possible or practical. With that in mind, here's a look at five common AI myths related to software development and deployment.
1. Artificial intelligence is a new technology.
AI has gained major attention from the media in just the last few years. Likewise, most software products that pitch AI as a key feature (like AIOps-driven monitoring tools) are still very new. But AI is not a new technology at all. The concept of artificial intelligence stretches back many centuries (see, for example, the Brazen Head), and software applications have been using AI to do things like play checkers since the 1950s.
Thus, if AI seems like a relatively new technology in the software world, or one that has only become practically usable in the past few years, that is only because it took the media and marketers a long time to catch up. The reality is that AI has been an established field of computer science for more than half a century.
2. AI is smarter than humans.
Some AI advocates would have you believe that AI-powered applications are “smarter” than humans, in the sense that they can solve problems or develop ideas more creatively and effectively than human minds. But the reality is that AI-powered software applications don’t outthink humans. They simply think faster than humans.
And when it comes to use cases that require nuanced reasoning and an understanding of human expression, AI fares particularly poorly, as IBM's recent experiment with AI-powered debate software showed.
There is a chance that this could change in the future. Someday, AI might become so sophisticated that AI-driven applications are genuinely smarter than humans. But that day remains beyond the horizon.
3. AI will lead to smaller IT teams.
Many marketers of AI-powered software tools sell their products as a way for companies to reduce the size (and, by extension, cost) of their IT teams. By using AI to automate IT decision-making and operations, they say, companies can do more with fewer staff members.
Some observers go so far as to claim that AI, combined with other innovations, is edging us closer to a world of "NoOps," wherein IT operations teams are fully replaced by software.
It's certainly true that AI can help to increase automation and reduce the manual effort required to perform certain tasks. However, the idea that AI will remove the need for human engineers entirely is fantastical. Someone still has to set up and manage the AI-powered tools that do the operations work.
Plus, there is an argument to be made that AI is not making IT operations simpler; it is merely helping IT Ops teams keep up with the ever-increasing complexity of new software and infrastructure. Deploying and managing containers and microservices requires much more work than dealing with virtual machines or bare-metal servers. In this sense, AI is simply helping IT teams to maintain the status quo; it is not empowering them to gain new ground.
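To make the automation point concrete, here is a minimal sketch of the sort of grunt work AIOps-style tools take off a team's plate: flagging a metric sample that deviates sharply from recent history. The metric values and the three-sigma threshold are illustrative assumptions, not anything a particular product does.

```python
# Minimal sketch of the kind of check an AIOps tool automates:
# flag metric samples that deviate sharply from recent history.
# The sample values and 3-sigma threshold are illustrative only.
from statistics import mean, stdev

def find_anomalies(samples, window=10, threshold=3.0):
    """Return (index, value) pairs where a sample exceeds
    `threshold` standard deviations of the preceding window."""
    anomalies = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            anomalies.append((i, samples[i]))
    return anomalies

# Example: steady CPU utilization with one spike an operator
# would otherwise have to spot by eye.
cpu = [42, 41, 43, 42, 44, 41, 43, 42, 41, 43, 42, 95, 43]
print(find_anomalies(cpu))  # [(11, 95)]
```

A human still decides what the tool watches, what counts as anomalous and what to do when the alert fires, which is exactly why the headcount never drops to zero.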
4. AI software is "set and forget."
On its face, AI tools can seem like a type of "set it and forget it" wonder. If data-powered algorithms and machine learning allow AI tools to make all the decisions they need, then humans don't have to do any work beyond the initial setup and data training, right?
Well, no. There are lots of reasons why even the best-designed AI tools need to be managed actively and continuously. They need to be retrained constantly with up-to-date data in order to make accurate decisions about ever-changing conditions. The quality of the data they rely on must be carefully managed to ensure it delivers the accuracy and clarity the tools require. And humans may need to provide ethical guidance for AI algorithms.
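As a rough illustration of why "set and forget" breaks down, here is a minimal sketch of a periodic retraining step with a simple data-quality gate. The scikit-learn calls are real, but fetch_recent_samples() and the gate's rules are hypothetical placeholders for whatever pipeline actually feeds the tool.

```python
# Minimal sketch of a periodic retrain-with-fresh-data loop.
# fetch_recent_samples() is a hypothetical stand-in for whatever
# pipeline feeds the tool; the quality gate is deliberately simple.
import numpy as np
from sklearn.linear_model import LogisticRegression

def fetch_recent_samples():
    # Placeholder: in practice this pulls labeled, up-to-date data.
    X = np.random.rand(500, 4)
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)
    return X, y

def passes_quality_gate(X, y, min_rows=100):
    # Reject batches that are too small, contain NaNs, or carry
    # only one label -- stale or broken feeds fail here first.
    return len(X) >= min_rows and not np.isnan(X).any() and len(set(y)) > 1

def retrain(model):
    X, y = fetch_recent_samples()
    if not passes_quality_gate(X, y):
        return model  # keep the previous model and alert a human
    model.fit(X, y)
    return model

model = retrain(LogisticRegression())
```

Even this toy loop needs someone to decide how often it runs, what the gate should reject and what happens when it does: ongoing work, not a one-time setup.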
5. AI will destroy the world.
The four AI myths that I have discussed above involve hype or an excess of confidence in the abilities of AI and machine learning. Now, I'd like to approach things from the opposite perspective by pointing out that AI is not at all a bad or useless technology.
Sure, AI has many shortcomings, and AI tools in many cases are not likely to live up fully to the promises behind them. But that doesn't mean that AI is the bane of our existence, or that software teams should not use it at all.
This is important to note because the conversation surrounding AI has so far tended to be polarized. On one side are technologists and futurists promising us that AI will lead us into utopia. On the other are fierce AI critics worried about an AI-driven dystopia marked by all manner of dehumanizing, unethical automations.
Neither of these views represents reality. AI will not fully replace humans, but it will make their jobs easier. AI won't completely remove the need to perform manual tasks, but it will reduce it. AI won't prove smarter than human beings, but it can provide insights that help them make smarter decisions.


The Future of IoT Devices Is in Question

While IoT devices and technology were all you heard about for a while, the buzz has dimmed. Here's why, and what developers need to know about the future of IoT.
Until very recently, the future of IoT seemed golden. But if you feel like you've been hearing less and less about the internet of things, you're probably right. Is it just a matter of the future of IoT becoming the mainstream present of IoT, or is the fervor around the promise of IoT devices dimming? That's a question many IT pros have likely been asking lately. To answer it, here's a look at current trends in the IoT ecosystem, as well as the challenges that IoT developers need to solve--pronto--if IoT is to keep growing as fast as analysts promised a few years ago.
In fact, according to Google Trends, interest in IoT peaked toward the end of 2016. Interest has ebbed and flowed since then, while slowly regressing toward pre-2016 levels.
Here's Google Trends' visualization of IoT "interest over time" during the past five years:
[Chart: Google Trends "interest over time" for IoT over the past five years. Source: trends.google.com]
Google Trends represents only one data point--and a crude one at that--about IoT's popularity. And there is no denying that the ecosystem surrounding the IoT remains vibrant. There is no shortage of IoT-themed conferences this calendar year, for example, and there are tens of thousands of job openings that mention IoT skills. (Although my quick analysis suggests that many of them are for relatively low-paying customer service positions, not IoT engineering jobs.)
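If you want to pull the same "interest over time" numbers yourself, a minimal sketch using the unofficial third-party pytrends package looks something like the following; treat the package and its API as an assumption to verify, since Google offers no official Trends API.

```python
# Minimal sketch of fetching Google Trends "interest over time"
# programmatically. pytrends is an unofficial, third-party client
# (`pip install pytrends`); verify it before relying on it.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["internet of things"], timeframe="today 5-y")
df = pytrends.interest_over_time()  # pandas DataFrame, weekly scores 0-100
print(df["internet of things"].tail())
```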
Still, by and large, IoT devices and technology no longer seem to command quite the level of interest that they once did. In addition, predictions about IoT's economic significance are not as heady as they once were. For example, consider Nest, the IoT darling acquired by Google, which turns out not to have a very impressive balance sheet.
And, whereas in 2017 analysts were predicting that the IoT market would be worth $457 billion by 2020, sources from 2018 forecast the market reaching only $318 billion by 2023. To be sure, predictions like these are ballpark figures, at best, but it is telling that they are shrinking despite a persistently strong economic climate during the past several years.
Persistent Challenges for IoT Devices and Technology
What explains the declining--or at least plateauing--interest in IoT? Here are a few of the major challenges that the IoT ecosystem must solve to achieve the dizzying growth rates that analysts promised circa 2016, at the height of IoT fever.
1. IoT Network Limitations
All IoT devices are connected to the Internet at least some of the time, and some are connected all of the time. But many IoT devices are not actually online around the clock. That's because many devices are located in places where connectivity or bandwidth is limited. There might not be regular cellular service, much less a wired network. Instead, these devices rely on things like non-cellular long-range wireless networks with low bandwidth or spotty service.
Some IoT devices also simply lack the power to be able to maintain a continuous Internet connection, even if standard networking is available to them.
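Developers typically cope with intermittent connectivity using a store-and-forward pattern: buffer readings locally, then flush them when a link is available. Here is a minimal sketch; link_is_up() and transmit() are hypothetical stand-ins for device-specific code.

```python
# Minimal store-and-forward sketch for an intermittently connected
# device: readings queue up locally and flush when the link returns.
# link_is_up() and transmit() are hypothetical, device-specific hooks.
import json
import time
from collections import deque

buffer = deque(maxlen=10_000)  # cap memory on constrained hardware

def link_is_up() -> bool:
    ...  # placeholder: check the radio / network interface

def transmit(payload: str) -> bool:
    ...  # placeholder: send upstream, return True on success

def record(reading: dict):
    buffer.append(json.dumps(reading))
    flush()

def flush():
    while buffer and link_is_up():
        if not transmit(buffer[0]):
            break          # link dropped mid-flush; retry later
        buffer.popleft()   # remove only after a confirmed send

record({"sensor": "temp-01", "ts": time.time(), "celsius": 21.4})
```

Patterns like this keep devices useful offline, but they add latency, storage pressure and failure modes that an always-connected design never has to think about.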
If you think that these challenges will go away soon enough as network infrastructure expands or is upgraded, think again. Networking expansions are quite costly, and it could take many more decades for reliable network infrastructure to reach remote parts of the globe.
2. IoT Security Problems
It's relatively easy to deploy an IoT device; it is much harder to secure it and the data it sends and receives.
Indeed, examples of poorly secured IoT devices proliferate. Compared to PCs and smartphones, IoT devices and software have a long way to go to reach a reliable level of security.
The problem in this regard is not technological as much as it is cultural. Developers know how to secure IoT devices in most cases, but they sometimes simply don't bother because there is not a decades-old culture of IoT security as there is for PC security. Until relatively late in IoT's history, no one was trying to hack baby heart monitors or home security cameras. As a result, security has not been a paramount concern for IoT developers--at least, not in the same way that it has been for web or PC developers.
Or, in some cases, IoT developers do things like set publicly known default passwords on each device as a convenience to users, even though they know it is a security risk.
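The well-understood alternative is to provision each device with unique, randomly generated credentials at the factory. Here is a minimal sketch of what that step might look like; store_on_device() is a hypothetical stand-in for writing to device storage.

```python
# Minimal sketch of provisioning per-device credentials at the
# factory, instead of a shared default password. Only a salted
# hash is stored on the device; the plaintext goes on the label.
import hashlib
import os
import secrets

def provision_device(device_id: str) -> str:
    password = secrets.token_urlsafe(12)       # unique per device
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)
    store_on_device(device_id, salt, digest)   # hypothetical flash write
    return password                            # printed on the unit's label

def store_on_device(device_id, salt, digest):
    ...  # placeholder for writing to device flash or a secure element

print(provision_device("cam-0001"))
```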
The culture surrounding security on IoT devices needs to change if the IoT is to keep growing at a fast rate.
3. Closed IoT Ecosystems and Proprietary Standards
A large part of the reason why PCs and smartphones have become ubiquitous is that they are driven by open standards. No matter who manufactured your laptop or your phone, you can use it to run almost all of the same applications or visit the same websites as you can from any other device, thanks to networking and programming standards that virtually all mainstream PCs and phones follow.
The same is not true with IoT. While organizations like the Open Connectivity Foundation are working hard to advance open standards for IoT, such standards are a long way from achieving universal adoption in the fragmented, proprietary IoT ecosystem. There is no Android operating system or Chrome Web browser for IoT. From this perspective, IoT today looks like computers did in the 1960s, long before a handful of operating system architectures, programming languages and networking protocols homogenized the market.
Eventually, I suspect that open standards will win over IoT. The history of computing suggests that openness and standardization usually prevail in the end within most digital realms. However, this change isn't likely to come about within the next year or two; it may take decades.
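To get a feel for what an open standard buys, consider MQTT, an OASIS- and ISO-standardized messaging protocol that is already widely used in IoT. The minimal sketch below publishes a reading via the third-party paho-mqtt client; the broker hostname and topic are hypothetical.

```python
# Minimal sketch of publishing a sensor reading over MQTT, an open,
# OASIS/ISO-standardized IoT messaging protocol, via the third-party
# paho-mqtt client (`pip install paho-mqtt`). The broker hostname
# and topic name here are hypothetical.
import json
import paho.mqtt.publish as publish

publish.single(
    "sensors/temp-01",                      # hypothetical topic
    payload=json.dumps({"celsius": 21.4}),
    hostname="broker.example.com",          # hypothetical broker
)
```

Any standards-compliant broker or subscriber can consume that message, regardless of vendor; that interoperability is precisely what most proprietary IoT stacks still lack.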
Conclusion
Will we still be talking about IoT five, 10 or 20 years from now? I think so. But I'm not convinced that IoT is on the path to achieving the level of greatness that IoT proponents were predicting a couple of years ago. To do that, IoT will need to solve some big problems in areas like networking, security and standardization--at a rate that is faster than it can probably achieve.
