Artificial Intelligence and its future

Artificial intelligence (AI) is intelligence demonstrated by machines, as opposed to the natural intelligence displayed by animals, including humans. Leading AI textbooks define the field as the study of “intelligent agents”: any system that perceives its environment and takes actions that maximize its chance of achieving its goals. Some popular accounts use the term “artificial intelligence” to describe machines that mimic “cognitive” functions that humans associate with the human mind, such as “learning” and “problem solving”; however, this definition is rejected by major AI researchers.
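To make the “intelligent agent” definition concrete, here is a minimal Python sketch of the perceive-and-act loop it describes. Everything in it (the percept dictionary, the scoring function, the action names) is a hypothetical illustration, not part of any standard library or of the definition itself.

    # A minimal "intelligent agent" step: given a percept of the
    # environment, choose the action that maximizes a goal score.
    from typing import Callable, Iterable

    def agent_step(percept: dict,
                   actions: Iterable[str],
                   score: Callable[[dict, str], float]) -> str:
        # Pick the action with the highest expected goal score.
        return max(actions, key=lambda action: score(percept, action))

    # Example: a thermostat-like agent trying to reach a target temperature.
    def thermostat_score(percept: dict, action: str) -> float:
        gap = percept["target"] - percept["temperature"]
        return gap if action == "heat_on" else -gap

    percept = {"temperature": 17.0, "target": 20.0}
    print(agent_step(percept, ["heat_on", "heat_off"], thermostat_score))
    # -> "heat_on", because the room is below its target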

AI in healthcare:

AI has taken on a much bigger role in healthcare during the pandemic and has gained significant traction as a result. The White House partnered with AI research institutions to mine scientific literature to better understand Covid-19. Biotech companies and big tech players leveraged AI to understand the structure of the novel coronavirus and expedite drug discovery. Social distancing and lockdown measures forced medical labs to accelerate their digital pathology capabilities. Amid economic uncertainties, healthcare AI companies raised record funding in Q3’20.

AI will change healthcare by 2030:

By 2030, AI will access multiple sources of data to reveal patterns in disease and aid treatment and care. Healthcare systems will be able to predict an individual’s risk of certain diseases and suggest preventive measures. AI will also help reduce waiting times for patients and improve efficiency in hospitals and health systems. The first big consequence is that by 2030 health systems will be able to deliver truly proactive, predictive healthcare.

AI and the Future of Work:

A recent study from Redwood Software and Sapio Research underscores how quickly automation is advancing. Participants in the 2017 study said they believe that 60 percent of businesses can be automated in the next five years. On the other hand, Gartner predicts that by 2020 AI will produce more jobs than it displaces. Dennis Mortensen, CEO and founder of x.ai, maker of the AI-based virtual assistant Amy, agreed: “I look at our firm and two-thirds of the jobs here didn’t exist a few years ago,” said Mortensen.

In addition to creating new jobs, AI will also help people do their jobs better — a lot better. At the World Economic Forum in Davos, Paul Daugherty, Accenture’s Chief Technology and Innovation Officer, summed up the idea: “Human plus machine equals superpowers.”

Potential to transform businesses and contribute to economic growth:

These technologies are already generating value in various products and services, and companies across sectors use them in an array of processes to personalize product recommendations, find anomalies in production, identify fraudulent transactions, and more. The latest generation of AI advances, including techniques that address classification, estimation, and clustering problems, promises significantly more value still. A McKinsey analysis of several hundred AI use cases found that the most advanced deep learning techniques deploying artificial neural networks could account for as much as $3.5 trillion to $5.8 trillion in annual value, or 40 percent of the value created by all analytics techniques.

Deployment of AI and automation technologies can do much to lift the global economy and increase global prosperity at a time when aging and falling birth rates are acting as a drag on growth. Labor productivity growth, a key driver of economic growth, has slowed in many economies: in the United States and major European economies, it dropped to an average of 0.5 percent in 2010–2014 from 2.4 percent a decade earlier, as the aftermath of the 2008 financial crisis compounded the waning of an earlier productivity boom. AI and automation have the potential to reverse that decline: productivity growth could potentially reach 2 percent annually over the next decade, with 60 percent of this increase coming from digital opportunities.

Works Cited:

Artificial intelligence – Wikipedia

The Role of Artificial Intelligence in the Future of Work | Blogs | CDC

AI, automation, and the future of work: Ten things to solve for (Tech4Good) | McKinsey

From the blog CS@Worcester blog – Syed Raza by syedraza089 and used with permission of the author. All other rights reserved by the author.

Where Are Programming, AI, and the Cloud Headed in 2021?

This study is based on title usage on O’Reilly online learning. The data includes all usage of the O’Reilly platform, not just content that O’Reilly has published, and certainly not just books. Search data is also included in the graphs.

Programming languages:

In the first figure, you can see year-over-year growth in usage and the number of search queries for several popular languages. The top programming languages according to O’Reilly are Python (up 27%), Java (down 3%), C++ (up 10%), C (up 12%), and JavaScript (up 40%). Looking at 2020 usage rather than year-over-year changes, it’s surprising to see JavaScript so far behind Python and Java (JavaScript usage is 20% of Python’s and 33% of Java’s).

Past the top five languages, Go has grown 16% and Rust has grown 94%. Go has clearly established itself, particularly as a language for concurrent programming, and Rust is likely to establish itself for “systems programming.”

Figure 2 shows what happens when you add the use of content about Python, Java, and JavaScript to the most important frameworks for those languages.

Adding usage and search query data for Spring (up 7%) reverses Java’s apparent decline (net-zero growth). Looking further at JavaScript, if you add in usage for the most popular frameworks (React, Angular, and Node.js), JavaScript usage on O’Reilly online learning rises to 50% of Python’s, only slightly behind Java and its frameworks.

None of these top languages are going away, though their stock may rise or fall as fashions change and the software industry evolves.

We see several factors changing programming in significant ways:

Multi-paradigm languages:

According to O’Reilly, the use of content on functional programming is up 14% since last year. Content on object-oriented programming is up even more, with 29% growth over the same period. Starting with Python 3.0 in 2008 and continuing with Java 8 in 2014, programming languages have added higher-order functions (lambdas) and other “functional” features. Several popular languages (including JavaScript and Go) have had functional features from the beginning. This trend started over 20 years ago (with the Standard Template Library for C++), and we expect it to continue.
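For readers who haven’t used these features, here is a small Python sketch of what higher-order functions and lambdas look like in practice; the list of prices is just illustrative data.

    # Higher-order functions: map, filter, and reduce all take another
    # function (here, lambdas) as an argument.
    from functools import reduce

    prices = [19.99, 5.49, 3.25, 42.00]  # illustrative data

    with_tax = list(map(lambda p: round(p * 1.0625, 2), prices))
    cheap = list(filter(lambda p: p < 10, prices))
    total = reduce(lambda acc, p: acc + p, prices, 0.0)

    print(with_tax)  # prices after a hypothetical 6.25% tax
    print(cheap)     # [5.49, 3.25]
    print(total)     # 70.73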

Concurrent programming:

Platform data for concurrency shows an 8% year-over-year increase. Java was the first widely used language to support concurrency as part of the language. Go, Rust, and most other modern languages have built-in support for concurrency. Concurrency has always been one of Python’s weaknesses.
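Python’s weakness here is usually attributed to the global interpreter lock, which prevents threads from executing Python bytecode in parallel. The standard library’s asyncio module still supports concurrent I/O, as in this minimal sketch (the sleeps stand in for real network calls):

    # Cooperative concurrency with asyncio: both tasks wait at the same
    # time, so the whole program takes about 1 second, not 2.
    import asyncio

    async def fetch(name: str, delay: float) -> str:
        await asyncio.sleep(delay)  # placeholder for real I/O
        return f"{name} done"

    async def main() -> None:
        results = await asyncio.gather(fetch("a", 1.0), fetch("b", 1.0))
        print(results)  # ['a done', 'b done']

    asyncio.run(main())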

Dynamic versus static typing:

The distinction between languages with dynamic typing (like Ruby and JavaScript) and statically typed languages (like Java and Go) is arguably more important than the distinction between functional and object-oriented languages. Python 3.5 added type hinting, and more recent versions have added additional static typing features. TypeScript, which adds static typing to JavaScript, is coming into its own (12% year-over-year increase).
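Python’s gradual typing is easy to demonstrate: the annotations in this minimal sketch do nothing at runtime, but a static checker such as mypy can verify them ahead of time (the function and data are illustrative).

    # Type hints (introduced in Python 3.5 via PEP 484): the interpreter
    # ignores them, but tools like mypy check them statically.
    from typing import Dict, Optional

    def find_user_id(users: Dict[str, int], name: str) -> Optional[int]:
        # Returns the id for a known name, or None otherwise.
        return users.get(name)

    ids = {"ada": 1, "alan": 2}
    print(find_user_id(ids, "ada"))  # 1
    # find_user_id(ids, 42)  # a type checker flags this; plain Python would not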

Low-code and no-code computing: 

Low-code is real and is bound to have an effect. Spreadsheets were the forerunner of low-code computing. When VisiCalc was first released in 1979, it enabled millions to do significant and important computation without learning a programming language. Democratization is an important trend in many areas of technology; it would be surprising if programming were any different.

AI, Machine Learning, and Data:

Healthy growth in artificial intelligence has continued: machine learning is up 14%, while AI is up 64%; data science is up 16%, and statistics is up 47%. While AI and machine learning are distinct concepts, there’s enough confusion about definitions that they’re frequently used interchangeably. We informally define machine learning as “the part of AI that works”; AI itself is more research oriented and aspirational. If you accept that definition, it’s not surprising that content about machine learning has seen the heaviest usage: it’s about taking research out of the lab and putting it into practice. It’s also not surprising that we see solid growth for AI, because that’s where bleeding-edge engineers are looking for new ideas to turn into machine learning.

It’s possible that AI (along with machine learning, data, big data, and all their fellow travelers) is descending into the trough of the hype cycle. We don’t think so, but we’re prepared to be wrong. As Ben Lorica has said (in conversation), many years of work will be needed to bring current research into commercial products. 

The future of AI and machine learning:

  • Many companies are placing significant bets on using AI to automate customer service. We’ve made great strides in our ability to synthesize speech, generate realistic answers, and search for solutions.
  • We’ll see lots of tiny, embedded AI systems in everything from medical sensors to appliances to factory floors. Anyone interested in the future of technology should watch Pete Warden’s work on TinyML very carefully.
  • Natural language has been (and will continue to be) a big deal. GPT-3 has changed the world. We’ll see AI being used to create “fake news,” and we’ll find that AI gives us the best tools for detecting what’s fake and what isn’t.

Web Development:

Since the invention of HTML in the early 1990s, the first web servers, and the first browsers, the web has exploded (or degenerated) into a proliferation of platforms. Those platforms make web development infinitely more flexible: They make it possible to support a host of devices and screen sizes. They make it possible to build sophisticated applications that run in the browser. And with every new year, “desktop” applications look more old-fashioned.

The foundational technologies HTML, CSS, and JavaScript are all showing healthy growth in usage (22%, 46%, and 40%, respectively), though they’re behind the leading frameworks. We’ve already noted that JavaScript is one of the top programming languages, and the modern web platforms are nothing if not the apotheosis of JavaScript. In the web’s early days, you could learn how a page worked simply by using “view source”; twenty-five years later, that’s no longer true. You can still “view source,” but all you’ll see is a lot of incomprehensible JavaScript. Ironically, just as other technologies are democratizing, web development is increasingly the domain of programmers.

Clouds of All Kinds:

It’s no surprise that the cloud is growing rapidly. Usage of content about the cloud is up 41% since last year. Usage of cloud titles that don’t mention a specific vendor (e.g., Amazon Web Services, Microsoft Azure, or Google Cloud) grew at an even faster rate (46%). Our customers don’t see the cloud through the lens of any single platform. We’re only at the beginning of cloud adoption; while most companies are using cloud services in some form, and many have moved significant business-critical applications and datasets to the cloud, we have a long way to go. If there’s one technology trend you need to be on top of, this is it.

The horse race between the leading cloud vendors, AWS, Azure, and Google Cloud, doesn’t present any surprises. Amazon is winning, even ahead of the generic “cloud”—but Microsoft and Google are catching up, and Amazon’s growth has stalled (only 5%). Use of content about Azure shows 136% growth—more than any of the competitors—while Google Cloud’s 84% growth is hardly shabby. When you dominate a market the way AWS dominates the cloud, there’s nowhere to go but down. But with the growth that Azure and Google Cloud are showing, Amazon’s dominance could be short-lived.

While our data shows very strong growth (41%) in usage for content about the cloud, it doesn’t show significant usage for terms like “multicloud” and “hybrid cloud” or for specific hybrid cloud products like Google’s Anthos or Microsoft’s Azure Arc. These are new products, for which little content exists, so low usage isn’t surprising. But the usage of specific cloud technologies isn’t that important in this context; what matters is that usage of all the cloud platforms is growing, particularly content that isn’t tied to any vendor. We also see that our corporate clients are using content that spans all the cloud vendors; it’s difficult to find anyone who’s looking at a single vendor.

Security and Privacy:

Security has always been a problematic discipline: defenders have to get thousands of things right, while an attacker only has to discover one mistake. And that mistake might have been made by a careless user rather than someone on the IT staff. 

Usage of CISSP content and training is 66% of general security content, with a slight (2%) decrease since 2019. Usage of content about the CompTIA Security+ certification is about 33% of general security content, with a strong 58% increase.

It’s disappointing that we see so little interest in content about privacy, including content about specific regulatory requirements such as GDPR. We don’t see heavy usage; we don’t see growth; we don’t even see significant numbers of search queries.

Works Cited:

Where Programming, Ops, AI, and the Cloud are Headed in 2021 – O’Reilly (oreilly.com)

From the blog CS@Worcester blog – Syed Raza by syedraza089 and used with permission of the author. All other rights reserved by the author.

Introduction

Hello, my name is Syed Minhal Raza. My major is Computer Science, and I am a junior at Worcester State University.

From the blog CS@Worcester blog – Syed Raza by syedraza089 and used with permission of the author. All other rights reserved by the author.