
LegalEdge Classroom Handout

Part of the Most Comprehensive & Consistently Successful Study Material & Test Series Module, spanning
both Offline and Online Programs in the entire Country. As a result, LegalEdge was able to engineer
clean-sweep, landslide figures of a handsome 64 Selections & 65 Selections in the Top 100 (including AIR 1, 2 & 3
from Classroom Contact Programs in 2023, 2022 & 2021) & a whopping 273 Selections & 327 Selections in the Top
500, in CLAT 2021 & CLAT 2022, respectively. With AILET being no different, a total of 34 of our students found
their way into NLU, Delhi in 2021 & 35 in 2022. In a nutshell, every second admit in a Top National Law School
in 2021 & 2022 came from the LegalEdge Preparation Ecosystem.

ENGLISH LANGUAGE

LANGUAGE IMPROVEMENT TOOLKIT (LIT) QUIZ-03

BASED ON SCIENCE AND TECHNOLOGY
READING MATERIAL

1. Article 01 - Climate change will shift the oceans’ colors

The color of deep blue oceans, shallow turquoise waters, and emerald green coasts is quickly changing as the
planet warms, according to new research published in the journal Nature.

Analyzing 20 years of satellite data, the study's authors found that over half the world's ocean, 56 percent,
experienced a shift in color. The cause? Changes in the density and distribution of plankton. These tiny
organisms contain chlorophyll, the bright green pigment that helps plants make food from sunlight.

The recent study supports a similar prediction made by a Nature Communications study published in 2019 that
modeled how phytoplankton will change as oceans continue to warm.

And while the new study used satellites to detect subtle changes in color, the prior research predicted
significant changes by 2100, if the world keeps warming at its current pace.

Under a “business-as-usual” scenario in which greenhouse gas emissions continue unabated, the bluest
subtropical zones of the ocean will become bluer, and greener regions along the equator and poles will become
greener, that study found.
More than just an oddity, the changing color is a warning sign, say the 2019 study authors, of drastic global
changes that will take place in a world warmed by climate change.

How the ocean gets its color


Sunlight penetrates over 600 feet below the surface of the ocean. Everything deeper is shrouded in darkness.
Above that depth, water molecules absorb all colors of light except blue, which is why blue is
reflected back.
Head Office: 127, Zone II, MP Nagar, Bhopal |+91-7676564400| https://www.toprankers.com Page 1 of 20
Organic matter that blankets the surface of the ocean, like phytoplankton, changes this color. As the ocean
warms, currents become more irregular, and the layers in the water become more stratified, meaning warm
regions don't mix as easily with cold regions.

There are thousands of phytoplankton species, uniquely adapted to warm or cold water. As oceans continue
warming, some species may die off, some will thrive, and others will migrate to different regions.

Looking at chlorophyll alone, however, won't tell scientists how a warming climate is altering
phytoplankton. Naturally occurring events like El Niños and La Niñas can influence how much phytoplankton
is concentrated in a given area.

Stephanie Dutkiewicz, an author on both papers and a marine ecologist at the Massachusetts Institute of
Technology, said in 2019 that models used to predict future changes in color factor phytoplankton life cycles
and movements into naturally occurring ocean patterns.

The 2023 study revealed that many of these predicted changes have already occurred. Using light-measuring
devices aboard NASA satellites, scientists observed that over half of the world covered by ocean already
showed a measurable shift in blue and green wavelengths, an approximation for the amount of chlorophyll in a
given region.

What do these changing colors mean?

It's too early to say for sure what effect these changing colors will have on the environment, but scientists think
more ecosystems could be dominated by smaller-sized plankton in the future, according to a press release from
the National Oceanography Center in the U.K., which supported the 2023 study.

The ocean has absorbed about a third of the world's carbon emissions, and marine life like kelp, seagrass, and
algae play a critical role in helping pull that carbon out of the atmosphere.

But smaller algae could reduce that climate change-fighting power.

“Phytoplankton are the base of the marine food web. Everything in the ocean requires phytoplankton to exist,"
Dutkiewicz told National Geographic in 2019. "The impact will be felt all the way up the food chain."

2. Article 02 - What is the Big Bang Theory?


The Big Bang Theory is the leading explanation for how the universe began. Simply put, it says the universe as
we know it started with an infinitely hot and dense single point that inflated and stretched — first at
unimaginable speeds, and then at a more measurable rate — over the next 13.7 billion years to the still-
expanding cosmos that we know today. Because existing technology doesn't yet allow astronomers to literally peer back
at the universe's birth, much of what we understand about the Big Bang comes from mathematical formulas
and models. Astronomers can, however, see the "echo" of the expansion through a phenomenon known as the
cosmic microwave background. While the majority of the astronomical community accepts the theory, there
are some theorists who have alternative explanations besides the Big Bang — such as eternal inflation or an
oscillating universe.

THE BIG BANG: THE BIRTH OF THE UNIVERSE


Around 13.7 billion years ago, everything in the entire universe was condensed in an infinitesimally small
singularity, a point of infinite density and heat.

Suddenly, an explosive expansion began, ballooning our universe outwards faster than the speed of light. This
was a period of cosmic inflation that lasted mere fractions of a second — about 10⁻³² of a second, according to
physicist Alan Guth’s 1980 theory that changed the way we think about the Big Bang forever.
When cosmic inflation came to a sudden and still-mysterious end, the more classic descriptions of the Big
Bang took hold. A flood of matter and radiation, known as "reheating," began populating our universe with the
stuff we know today: particles, atoms, the stuff that would become stars and galaxies and so on.

This all happened within just the first second after the universe began, when the temperature of everything was
still insanely hot, at about 10 billion degrees Fahrenheit (5.5 billion Celsius), according to NASA. The cosmos
now contained a vast array of fundamental particles such as neutrons, electrons and protons — the raw
materials that would become the building blocks for everything that exists today.
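As a quick arithmetic check (mine, not the article's), the quoted NASA figures are consistent with the standard Fahrenheit-to-Celsius conversion:

```python
# Sanity check of the quoted conversion: 10 billion degrees Fahrenheit
# should correspond to roughly 5.5 billion degrees Celsius.

def fahrenheit_to_celsius(f):
    """Standard conversion: C = (F - 32) * 5/9."""
    return (f - 32) * 5.0 / 9.0

t_f = 10e9                      # ~10 billion degrees Fahrenheit
t_c = fahrenheit_to_celsius(t_f)
print(round(t_c / 1e9, 2))      # -> 5.56, i.e. about 5.5 billion Celsius
```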

This early "soup" would have been impossible to actually see because it couldn't hold visible light. "The free
electrons would have caused light (photons) to scatter the way sunlight scatters from the water droplets in
clouds," NASA stated. Over time, however, these free electrons met up with nuclei and created neutral atoms

or atoms with equal positive and negative electric charges.

This allowed light to finally shine through, about 380,000 years after the Big Bang.

Sometimes called the "afterglow" of the Big Bang, this light is more properly known as the cosmic microwave
background (CMB). It was first predicted by Ralph Alpher and other scientists in 1948 but was found only by
accident almost 20 years later.

This accidental discovery happened when Arno Penzias and Robert Wilson, both of Bell Telephone
Laboratories in New Jersey, were building a radio receiver in 1965 and picked up higher-than-expected
temperatures, according to a NASA article. At first, they thought the anomaly was due to pigeons trying to
roost inside the antenna and their waste, but they cleaned up the mess and killed the pigeons, and the anomaly
persisted.

Simultaneously, a Princeton University team led by Robert Dicke was trying to find evidence of the CMB and
realized that Penzias and Wilson had stumbled upon it with their strange observations. The two groups each
published papers in the Astrophysical Journal in 1965.

Has the Big Bang Theory been proven?
This isn't really a statement that we can make in general. The best we can do is say that there is strong evidence
for the Big Bang Theory and that every test we throw at it comes back in support of the theory. Mathematicians
prove things, but scientists can only say that the evidence supports a theory with some degree of confidence
that is always less than 100%.

So, a short answer to a slightly different question is that all of the observational evidence that we've gathered is
consistent with the predictions of the Big Bang Theory. The most important observations are:

1) The Hubble Law shows that distant objects are receding from us at a rate proportional to their distance —
which occurs when there is uniform expansion in all directions. This implies a history where everything
was closer together.
2) The properties of the cosmic microwave background radiation (CMB). This shows that the universe went
through a transition from an ionized gas (a plasma) to a neutral gas. Such a transition implies a hot, dense
early universe that cooled as it expanded. This transition happened about 400,000 years after the
Big Bang.
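The proportionality in observation 1 can be sketched numerically. Below is a minimal Python illustration (not from the article; the Hubble-constant value and the distances are illustrative assumptions) showing that under the Hubble Law, v = H0 × d, doubling an object's distance doubles its recession velocity:

```python
# Hubble Law sketch: recession velocity is proportional to distance.
# H0 and the distances below are illustrative values, not measurements.

H0 = 70.0  # assumed Hubble constant, in km/s per megaparsec

def recession_velocity(distance_mpc):
    """Hubble Law: v = H0 * d."""
    return H0 * distance_mpc

# Uniform expansion scales every separation by the same factor, so a galaxy
# twice as far away recedes twice as fast, from any observer's vantage point.
for d in (10, 20, 40):  # distances in megaparsecs
    print(d, "Mpc ->", recession_velocity(d), "km/s")
```

This is why uniform expansion in all directions implies that everything was once closer together: running the scaling backwards shrinks every separation by the same factor.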

When was the Big Bang Theory established?


Who came up with the idea?

Hubble was really the person who set up the observations. Evidence continued to mount, especially in the
1970s with the detection of the CMB. The term "Big Bang" was first used in the late 1940s by the astronomer
Fred Hoyle — eventually, it caught on in the 1970s.

"We are trying to do something like guessing a baby photo of our universe from the latest picture," study leader
Masato Shirasaki, a cosmologist at the National Astronomical Observatory of Japan (NAOJ), wrote in an email
to our sister website Live Science.

Other researchers have chosen different paths to interrogate our universe's beginnings.

In a 2020 study, researchers did so by investigating the split between matter and antimatter. In the study, not
yet peer-reviewed, they proposed that the imbalance in the amount of matter and antimatter in the universe is
related to the universe's vast quantities of dark matter, an unknown substance that exerts influence over gravity

and yet doesn't interact with light. They suggested that in the crucial moments immediately after the Big Bang,
the universe may have been pushed to make more matter than its inverse, antimatter, which then could have led
to the formation of dark matter.

Examining the CMB also gives astronomers clues as to the composition of the universe. Researchers think

most of the cosmos is made up of matter and energy that cannot be "sensed" with our conventional
instruments, leading to the names "dark matter" and "dark energy." It is thought that only 5% of the universe is
made up of matter such as planets, stars and galaxies.

While astronomers study the universe's beginnings through creative measures and mathematical simulations,
they've also been seeking proof of its rapid inflation. They have done this by studying gravitational waves, tiny
perturbations in space-time that ripple outwards from great disturbances like, for instance, two black holes
colliding, or the birth of the universe.

As the universe expanded, it created the CMB and a similar "background noise" made up of gravitational
waves that, like the CMB, were a sort of static, detectable from all parts of the sky. Those gravitational waves,
according to the LIGO Scientific Collaboration, produced a theorized barely-detectable polarization, one type
of which is called "B-modes."

However, since then gravitational waves have not only been confirmed to exist, they have been observed
multiple times.

THE UNIVERSE'S CONTINUED EXPANSION


The universe is not only expanding, but its expansion is accelerating. This means that with time, nobody will be able to
spot other galaxies from Earth or any other vantage point within our galaxy.

"We will see distant galaxies moving away from us, but their speed is increasing with time," Harvard
University astronomer Avi Loeb said in a March 2014 Space.com article.

"So, if you wait long enough, eventually, a distant galaxy will reach the speed of light. What that means is that
even light won't be able to bridge the gap that's being opened between that galaxy and us. There's no way for
extraterrestrials on that galaxy to communicate with us, to send any signals that will reach us, once their galaxy
is moving faster than light relative to us."

Some physicists also suggest that the universe we experience is just one of many. In the "multiverse" model,
different universes would coexist with each other like bubbles lying side by side. The theory suggests that in
that first big push of inflation, different parts of space-time grew at different rates. This could have carved off
different sections — different universes — with potentially different laws of physics.

"It's hard to build models of inflation that don't lead to a multiverse," Alan Guth, a theoretical physicist at the
Massachusetts Institute of Technology, said during a news conference in March 2014 concerning the
gravitational waves discovery. (Guth is not affiliated with that study.)

"It's not impossible, so I think there's still certainly research that needs to be done. But most models of inflation
do lead to a multiverse, and evidence for inflation will be pushing us in the direction of taking [the idea of a]
multiverse seriously."

While we can understand how the universe we see came to be, it's possible that the Big Bang was not the first
inflationary period the universe experienced. Some scientists believe we live in a cosmos that goes through
regular cycles of inflation and deflation, and that we just happen to be living in one of these phases.

3. Article 03 - Exploring the Metaverse

m
With the world what it is these days, you can see why people might be itching for an alternate reality—a way
to reboot the system and start fresh. That’s the appeal of virtual realms: They’re places where power can be
inverted, disappointments escaped, and capitalist inequities left behind for something more exciting, malleable,
and meaningful.

It’s no wonder, then, that online universes like Fortnite and Roblox currently attract nearly 400 million users,
and others like Decentraland and the Sandbox are growing rapidly. The market for them will soon be worth
more than $1 trillion, estimates show. Facebook has changed its name to Meta to signal its belief in a virtual
future. Microsoft is preparing for workplaces populated by digital avatars. Fashion brands from Nike to Gucci
are designing clothes and accessories for the metaverse. J.P. Morgan and Samsung have set up shop in
Decentraland. On Roblox players can operate their own Forever 21 stores and even sell their own designs in
them. Many companies are making big bets on the metaverse (even if most people still aren’t quite sure what it
is).

Three new books help explain why. Navigating the Metaverse, by Cathy Hackl, Dirk Lueth, and Tommaso Di
Bartolo; The Metaverse Handbook, by QuHarrison Terry and Scott Keeney; and Step into the Metaverse, by
Mark van Rijmenam, all set themselves up as Lonely Planet guides to the digital frontier.

While definitions of it vary, here are some basics about the metaverse: It’s actually many metaverses, or digital
spaces, which typically are decentralized, incorporate augmented and virtual reality, store information on
blockchain, and allow users to own digital goods. So like “the internet,” the term “the metaverse” describes a
sprawling network of sites and spaces.

In practice the metaverse offers a new way to be online, with new markets and products. In their book, Hackl,
Lueth, and Di Bartolo state that it presents three paradigm shifts:
1. Experience: People don’t just want to consume. It’s far more engaging to have gamified, contextual
experiences.
2. Identity: People value their digital persona and want to carry it with them across the metaverse and even
into the real world.
3. Ownership: Wherever people choose to spend their time, they want skin in the game.

In other words the endgame is to have a unified digital identity on blockchain—an identity that’s the same
whether you’re signing in to your work computer or gaming at night. It will contain the keys to your crypto,
the NFTs you bought for your digital house in Decentraland, and all your other important data. In the
metaverse you’re less a user than you are a member.

This opens a whole new world of possibilities. Terry and Keeney point to Roblox as an example of what’s to
come. On it players design games and spaces, and people gather for events in a way that they can’t on social
media sites. Keeney (who is also known as “DJ Skee”) worked with Paris Hilton to build Paris World on
Roblox, where she threw a New Year’s Eve celebration that drew more attendees than Times Square’s did.
“This is the future of partying,” she tells the authors.

What’s most striking about the metaverse (and its cousin, Web3) is the emphasis on ownership. Users can have
a stake in almost anything; they can vote on decisions about the communities they belong to and the apps they
use, make and sell NFTs, and even get paid for playing games in decentralized apps (dApps) that run on peer-
to-peer networks rather than on servers. User ownership is a real revolution because it creates a new economy.
The best version of the metaverse, says van Rijmenam, will liberate users, allowing them to easily move
communities and digital goods from platform to platform—to, say, take a Facebook group to Roblox, and then
transfer a piece of art made there over to Fortnite. In this vision, users can monetize their digital assets, selling,
renting, or even borrowing against them.

The message, it seems, is that while users got the short end of the stick on the old web, where they traded their
data for free search engines and social media platforms, they (or, rather, the architects of this new web) are
renegotiating that deal. “Play becomes labor that produces assets worth something within that dApp (or even in
the broader metaverse),” write Hackl, Lueth, and Di Bartolo. That might involve creating monsters in the game
Axie Infinity and selling them to other players or earning tokens with them, freelancing as a brand ambassador
in Decentraland, or hawking digital art or avatar gear. Instead of the dopamine hit of likes, the rewards of
online life come in cold, hard crypto.

It’s an exciting pitch—because the old web leaves a lot to be desired. The ad-based model makes users’
information the product; a few giant companies have so much power that they’re almost impossible to regulate;
and the endless drive for engagement promotes divisive content, conspiracy theories, and trolling. All of which
makes spending time on social media seem like a light vice: I for one talk about Twitter as if it’s a casual
smoking habit I can’t give up. An alternative that could break up some of the entrenched power and
reinvigorate the web should be welcome news.

Yet I can’t help seeing the dystopian side of this future. Work isn’t becoming play; play is becoming work. It
feels as if instead of offering digital liberation and ownership, the metaverse is offering more responsibilities
without a promotion. Do I want to bring everything I do in my free time to work with my avatar, dragging all
my other interests and relationships along with me? Do I want to turn my leisure activity into a small business?
And do I want to spend even more of my life online? Or have my online life supplant my humble one in the
physical world?

Those are exactly the kinds of quandaries that characters work to escape in books, TV shows, and movies
about virtual reality, from Neal Stephenson’s 1992 sci-fi classic Snow Crash (which coined the term
“metaverse”) to the Netflix series Black Mirror.

Is the metaverse our future? Companies like Meta and Microsoft seem to think so, though their virtual worlds
remain closed rather than the open ideal. While the metaverse promises a digital utopia, users find themselves
working harder than ever to play, as they transform leisure activities into small businesses. There’s no doubt
that excitement, money, and momentum are pushing us to some new form of digital reality. One way or
another, it will reflect the desires of its user base, be they entrepreneurship, escape, or convenience. Dystopia is
one risk. Another is disappointment: We dream of the metaverse but end up with a mall.

4. Article 04 - Natural Selection
First published Wed Sep 25, 2019

Charles Darwin and Alfred Wallace are the two co-discoverers of natural selection (Darwin & Wallace 1858),
though, between the two, Darwin is the principal theorist of the notion; his most famous work on the topic is
On the Origin of Species (Darwin 1859). For Darwin, natural selection is a drawn-out, complex process
involving multiple interconnected causes. Natural selection requires variation in a population of organisms.
For the process to work, at least some of that variation must be heritable and passed on to organisms’
descendants in some way. That variation is acted upon by the struggle for existence, a process that in effect
“selects” variations conducive to the survival and reproduction of their bearers. Much like breeders choose
which of their animals will reproduce and thereby create the various breeds of domesticated dogs, pigeons, and
cattle, nature effectively “selects” which animals will breed and creates evolutionary change just as breeders

do. Such “selection” by nature, natural selection, occurs as a result of the struggle for existence and, in the case
of sexual populations, the struggle for mating opportunities. That struggle is itself the result of checks on the
geometric population increase that would occur in the absence of the checks. All populations, even slow-
breeding ones such as those of elephants, will increase in size in the absence of limitations on growth that are
imposed by nature. These checks take different forms in different populations. Such limitations may take the
form of limited food supply, limited nesting sites, predation, disease, harsh climatic conditions, and much else
besides. One way or another, only some of the candidate reproducers in natural populations actually do
reproduce, often because others simply die before maturity. Owing to the variations among the candidate
reproducers, some have better chances of making it into the sample of actual reproducers than do others. If
such variations are heritable, the offspring of those with the “beneficial” traits will be likely to produce
especially many further descendants themselves. To use one of Darwin’s own examples, wolves with
especially long legs that allow them to run more quickly will be more likely to catch prey and thereby avoid
starvation and so produce offspring that have especially long legs that allow them, in turn, to breed and
produce still more long-legged descendants, and so on. By means of this iterative process, a trait conducive to
reproduction that is initially found in one or a few population members will spread through the population.
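The iterative process described above can be sketched as a toy simulation (my illustration, not from the article; the trait labels and fitness values are hypothetical): individuals are sampled as parents in proportion to their reproductive success, offspring inherit the parent's trait, and an advantageous trait tends to spread over generations.

```python
# Toy sketch of differential reproduction: a heritable trait that raises
# reproductive success tends to spread through a population over generations.
# Trait names and fitness values are hypothetical, for illustration only.
import random

random.seed(1)  # fixed seed so repeated runs give the same trajectory

def next_generation(population, fitness={"long": 1.2, "short": 1.0}):
    """Sample parents in proportion to fitness; offspring inherit the trait."""
    weights = [fitness[trait] for trait in population]
    return random.choices(population, weights=weights, k=len(population))

pop = ["long"] * 10 + ["short"] * 90   # the advantageous trait starts rare
for _ in range(100):
    pop = next_generation(pop)

# After many generations "long" is typically far more common than its
# initial 10% (though random drift means it is not guaranteed in every run).
print(pop.count("long"))
```

Note that the sketch also exhibits drift: because parents are sampled randomly, a beneficial variant can occasionally be lost, which mirrors the distinction drawn later between selection and drift.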

Multiple bouts of Darwin’s process involving different traits, acting sequentially or in concert, may then
explain both how speciation and the evolution of complex adaptations occur through the gradual evolution
(change over time) of natural populations. Darwin aimed to convince his audience that even such structures as
the vertebrate eye, which at first seem explicable only as the product of design, could instead be explained by
incremental, directional evolution, a complex but still naturalistic process (Darwin 1859: ch. 6). What is
initially a light sensitive patch may be transformed into an eye by means of a great many bouts of selection that
gradually improve and enhance its sensitivity. Showing that something is explicable is importantly different
from explaining it (Lennox 1991); still, a theory must be an explanatory sort of theory for it to accomplish
either task. After Darwin, the appearance of novel species in the geological record and the existence of
designed-appearing adaptations cannot be used as grounds for invoking supernatural causes as a matter of last
explanatory resort.

1. Two Conceptions of Natural Selection


Natural selection is chiefly discussed in two different ways among contemporary philosophers and biologists.
One usage, the “focused” one, aims to capture only a single element of one iteration of Darwin’s process under
the rubric “natural selection”, while the other, the “capacious” usage, aims to capture a full cycle under the
same rubric. These are clearly alternative, non-competing uses of the term, and distinct philosophical
controversies surround each one.
In Darwin’s wake, theorists have developed formal, quantitative approaches to modeling Darwin’s process.
The “focused” usage of natural selection finds its home as an interpretation of a single term in some of these
formalisms (and only some of them).

Some philosophers’ definitions of natural selection are clearly intended to capture this focused usage of the
term. Millstein, for instance, characterizes selection as a discriminate sampling process (Millstein 2002: 35).
Otsuka identifies natural selection with the causal influence of traits on offspring number in causal-graphical
models (Otsuka 2016: 265). Okasha interprets the covariance of offspring number and offspring phenotype as
quantifying the causal influence of selection (Okasha 2006: 28). Clearly, these uses of “natural selection” are
meant to capture only an element of Darwin’s process; they make no mention of inheritance or replication. As
discussed further below, controversies over the focused notion of selection have to do with whether the focused
notion of selection can be distinguished from that of drift (section 3), and whether selection, in the focused
sense, should be counted as a cause (section 5).

The alternative, “capacious”, usage of the notion of natural selection is to capture Darwin’s process in its
entirety, rather than a single contributor to it. Because Darwin’s process is cyclical, specifying what is
sufficient for a single cycle of it, a single instance of, say, replication of genes for long legs caused by long-
legged wolves making narrow captures, is sufficient to specify a process that may explain adaptation and
speciation. This is true, anyway, when it is added that the process gets repeated. The capacious notion,

capturing a cycle of Darwin’s process, is used by Lewontin and later authors working in the same vein, who
put forward conditions for evolution by natural selection: these include variation, inheritance, and
reproduction. While falling within the scope of “natural selection” in the capacious sense used by Lewontin,
these elements of Darwin’s process are treated as distinct from natural selection when that notion is used in its
focused sense.

Is Evolution Necessary for Natural Selection?

One natural way to arbitrate the issue of whether systems that undergo selection must evolve is to attend to the
point of statements of principles of natural selection, or statements of the requirements for selection. Many
theorists take it that the point of these principles is to set out the scope of a theory in the special sciences that
deals with selection and evolution, evolutionary theory. Lewontin claims that the theory of evolution by natural
selection rests on his three principles (1978: 220). Equally, Godfrey-Smith claims that statements of conditions
for evolution by natural selection exhibit the coherence of evolutionary theory and capture some of its core
principles (2007: 489). Finally, Maynard-Smith (1991) offered a statement of conditions for selection that
include evolution as a necessary component, calling the theory so delineated, “Darwin’s theory”. For these
writers, the (or at least a) point of the principles seems to be to capture the domain of application of the theory
we have inherited from Darwin.
Darwin would have been surprised to hear that his theory of natural selection was circumscribed so as to apply
only to evolving populations. He himself constructed an explanation of a persistent polymorphism, heterostyly,
using his own theory. Plants exhibiting heterostyly develop two, or sometimes three, different forms of flower
whose reproductive organs vary in a number of ways, principally length. Some plants exhibit different
forms of flower on the same plant, while some are dimorphic and trimorphic, with only one sort of flower per
plant. Darwin interpreted the flower variations as conducive to intercrossing, which he thought was beneficial,
at least for many organisms. Populations should not evolve directionally such that a single form of flower
spreads throughout the population; instead, multiple variants should be retained, a polymorphism. Darwin
thinks it clear that heterostyly is an adaptation:
The benefit which heterostyled dimorphic plants derive from the existence of the two forms is sufficiently
obvious [….] Nothing can be better adapted for this end than the relative positions of the anthers and stigmas
in the two forms. (Darwin 1877: 30; thanks to Jim Lennox for this reference)

Even though the population is not evolving, but instead remaining the same over time, it exhibits an adaptation
that consists in this persistent lack of change, an adaptation that Darwin thought explicable using his theory.
For a more recent and especially compelling case of a selectionist explanation of a polymorphism, see Bayliss,
Field, and Moxon’s selectionist explanation of a fimbriae polymorphism produced by contingency genes
(Bayliss, Field, & Moxon 2001).
Head Office: 127, Zone II, MP Nagar, Bhopal |+91-7676564400| https://www.toprankers.com Page 8 of 20
Darwin thought his theory could explain a lack of evolution, and Darwinists in Darwin’s wake have explained
not only stable polymorphisms, but unstable ones, cyclical behaviors, protected polymorphisms, and a variety
of other behaviors that differ from simple directional evolution. These sorts of behaviors result from specific
assignments of values for theoretical parameters in many of the very same models that are used to explain
simple directional selection (where a single variant spreads throughout a population, as in the wolf case
discussed in the introduction). The point is that systems seemingly governed by evolutionary theory exhibit a
variety of different sorts of dynamics, and this variety includes both different sorts of evolution, including at
least cyclical and directional, as well as a lack of evolution at all, as in cases of stabilizing selection.

5. Article 05 - The Water Frame: Revolutionizing Textiles in the Industrial Age


The invention of the water frame rocked the textile industry during the Industrial Revolution, transforming
how we spin and produce yarn. Its impact resonates today as modern textile technology continues to build

m
upon the principles established by this pioneering invention.

o
From cutting-edge spinning machines to computer-controlled looms, today's textile industry embodies the

c
relentless pursuit of efficiency, productivity and innovation. The water frame paved the way for the
extraordinary advancements in textile manufacturing we experience in the modern era.

.
What Was the Water Frame?

ers
The water frame, also known as the spinning frame, is a mechanized spinning machine powered by water that

k
revolutionized the textile industry during the Industrial Age. Its primary function was to automate the process

n
of spinning cotton fibers into yarn. Unlike traditional spinning wheels that required human labor, the water

a
frame introduced mechanization to the spinning process, significantly increasing productivity and efficiency.

r
The main component of the water frame was a vertical frame containing multiple spindles. Each spindle could
spin several threads simultaneously. By harnessing the power of water (typically through a system of belts,
pulleys and gears) the machine converted rotational energy into the spinning motion of the spindles. This
allowed for the rapid and consistent production of fine yarn at a much faster rate than laborers could do by
hand.

The water wheel combined water power — the energy of flowing or falling water — with mechanical systems
to generate and transmit mechanical energy. It acted as a middleman between the power of water and the
operation of various machinery, allowing for the use of water power in different industrial processes.

Think of the water wheel as a bridge between the force of flowing or falling water and the operation of
machines, just like a connector that transfers power from a waterfall to a giant wheel, enabling it to turn and
provide energy for different types of machinery in industries.
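The energy conversion this analogy describes can be put in rough numbers. The sketch below is a hypothetical illustration of the standard hydropower relation P = ρ·g·Q·h (the flow rate, head, and efficiency figures are invented, not taken from the article):

```python
# Hypothetical illustration of the hydropower relation P = rho * g * Q * h.
# The flow rate, head, and efficiency values are invented examples,
# not figures from the article.

RHO = 1000.0  # density of water, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def water_power_watts(flow_m3_per_s: float, head_m: float, efficiency: float = 0.6) -> float:
    """Mechanical power a water wheel could deliver, in watts.

    `efficiency` is the assumed fraction of the water's energy the wheel
    captures; traditional wheels were far from perfect, so 0.6 is a guess.
    """
    return RHO * G * flow_m3_per_s * head_m * efficiency

# A small mill stream: 0.2 m^3/s of water falling 3 m gives roughly 3.5 kW
# of steady mechanical power, which never tires the way hand spinners do.
power = water_power_watts(0.2, 3.0)
```

The point of the formula is the one the article makes in prose: the wheel is only a converter, so the power available to the spindles is set by how much water falls and from what height.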

Who Invented the Water Frame?


English engineer Richard Arkwright invented the water frame during the late 18th century. Passionate about
machinery and textiles, the self-taught inventor came up with the breakthrough idea for the water frame while
working as a wig maker. Observing the process of using horsehair to create wigs, he realized he could apply a
similar principle to spinning cotton fibers into yarn.

Arkwright's first successful prototype was a spinning machine powered by a water wheel, which he patented in
1769. It used multiple spindles that could spin several threads simultaneously, vastly improving productivity
compared to traditional hand-spinning methods.

This invention laid the foundation for the water frame, a later iteration that became even more efficient and
widely adopted in the textile industry.

Arkwright's innovative use of water power and his ability to design and refine spinning machinery marked a
significant milestone in the history of textile technology.

How the Water Frame Operates


A water wheel or turbine was connected to a series of belts, pulleys and gears to operate the water frame. When
water pulled from the river flowed onto the wheel or turbine, it created rotational energy. This energy was
transmitted through the machinery and powered the spinning process.

The spindles in the water frame were arranged vertically and could hold multiple bobbins, like a Ferris wheel,
standing tall and holding multiple baskets. Each bobbin had a length of roving, which consisted of cotton fibers
that had been carded and drawn out. As the water frame spun, the spindles twirled rapidly, twisting the cotton
fibers into smooth, fine yarn. Spinning multiple threads simultaneously surpassed the efficiency of traditional,
manual cotton-spinning methods.

Importance and Impact

Richard Arkwright's water frame transformed the textile industry with its mechanized spinning process. Its
impact was far-reaching, revolutionizing production methods and paving the way for significant advancements
in cotton yarn manufacturing.

Concentration of Production

The water frame set the stage for centralized textile production in factories, leading to economies of scale and
the concentration of resources in one location.

Continuous Production and Increased Output

The mechanized spinning process, capable of producing cotton yarn consistently and at a faster rate than
manual methods, revolutionized the textile industry by enabling continuous production even beyond daylight
hours, leading to drastic increases in output and driving the sector's growth. Unlike human laborers, the water
frame could operate 24 hours a day without a break.

Economic Impact

The innovation increased productivity and efficiency, stimulating the textile industry's growth, expanding
markets for cotton goods and driving further technological advancements.

Mechanized Spinning

The water frame automated the labor-intensive spinning process, increasing speed, efficiency and consistency
in yarn production.

Transition to Factory System


The water-powered spinning machine played a crucial role in the development of the factory system,
facilitating the shift from small-scale cottage industries to large-scale factories.

Utilization of Water Power


The water frame harnessed the power of water through waterwheels or turbines, allowing textile factories to be
established near rivers and streams for convenient energy supply.

6. Article 06 - Genetically Modified Organisms (GMOs): Transgenic Crops and Recombinant DNA
Technology
People have been altering the genomes of plants and animals for many years using outmoded breeding
techniques. Artificial selection for specific, desired traits has resulted in a variety of different organisms,
ranging from sweet corn to hairless cats. But this artificial selection, in which organisms that exhibit specific
traits are chosen to breed subsequent generations, has been limited to naturally occurring variations. In recent
decades, however, advances in the field of genetic engineering have allowed for precise control over the
genetic changes introduced into an organism. Today, we can incorporate new genes from one species into a
completely unrelated species through genetic engineering, optimizing agricultural performance or facilitating
the production of valuable pharmaceutical substances. Crop plants, farm animals, and soil bacteria are some of
the more prominent examples of organisms that have been subject to genetic engineering.

Current Use of Genetically Modified Organisms

Agricultural plants are one of the most frequently cited examples of genetically modified organisms (GMOs).
Some benefits of genetic engineering in agriculture are increased crop yields, reduced costs for food or drug
production, reduced need for pesticides, enhanced nutrient composition and food quality, resistance to pests
and disease, greater food security, and medical benefits to the world's growing population. Advances have also
been made in developing crops that mature faster and tolerate aluminum, boron, salt, drought, frost, and other
environmental stressors, allowing plants to grow in conditions where they might not otherwise flourish (Table
1; Takeda & Matsuoka, 2008). Other applications include the production of nonprotein (bioplastic) or
nonindustrial (ornamental plant) products. A number of animals have also been genetically engineered to
increase yield and decrease susceptibility to disease. For example, salmon have been engineered to grow larger
and mature faster, and cattle have been enhanced to exhibit resistance to mad cow disease (United States
Department of Energy, 2007).

r
Potential GMO Applications


Many industries stand to benefit from additional GMO research. For instance, a number of microorganisms are
being considered as future clean fuel producers and biodegraders. In addition, genetically modified plants may
someday be used to produce recombinant vaccines. In fact, the concept of an oral vaccine expressed in plants
(fruits and vegetables) for direct consumption by individuals is being examined as a possible solution to the
spread of disease in underdeveloped countries, one that would greatly reduce the costs associated with
conducting large-scale vaccination campaigns. Work is currently underway to develop plant-derived vaccine
candidates in potatoes and lettuce for hepatitis B virus (HBV), enterotoxigenic Escherichia coli (ETEC), and
Norwalk virus. Scientists are also looking into the production of other commercially valuable proteins in
plants, such as spider silk protein and polymers that are used in surgery or tissue replacement (Ma et al., 2003).
Genetically modified animals have even been used to grow transplant tissues and human transplant organs, a
concept called xenotransplantation. The rich variety of uses for GMOs provides a number of valuable benefits
to humans, but many people also worry about potential risks.

Risks and Controversies Surrounding the Use of GMOs


Despite the fact that the genes being transferred occur naturally in other species, there are unknown
consequences to altering the natural state of an organism through foreign gene expression. After all, such
alterations can change the organism's metabolism, growth rate, and/or response to external environmental
factors. These consequences influence not only the GMO itself, but also the natural environment in which that
organism is allowed to proliferate. Potential health risks to humans include the possibility of exposure to new
allergens in genetically modified foods, as well as the transfer of antibiotic-resistant genes to gut flora.

Unintended Impacts on Other Species: The Bt Corn Controversy


One example of public debate over the use of a genetically modified plant involves the case of Bt corn. Bt corn
expresses a protein from the bacterium Bacillus thuringiensis. Prior to construction of the recombinant corn,
the protein had long been known to be toxic to a number of pestiferous insects, including the monarch
caterpillar, and it had been successfully used as an environmentally friendly insecticide for several years. The
benefit of the expression of this protein by corn plants is a reduction in the amount of insecticide that farmers
must apply to their crops. Unfortunately, seeds containing genes for recombinant proteins can cause
unintentional spread of recombinant genes or exposure of non-target organisms to new toxic compounds in the
environment.

Unintended Economic Consequences


Another concern associated with GMOs is that private companies will claim ownership of the organisms they
create and not share them at a reasonable cost with the public. If these claims are correct, it is argued that use
of genetically modified crops will hurt the economy and environment, because monoculture practices by large-
scale farm production centers (who can afford the costly seeds) will dominate over the diversity contributed by
small farmers who can't afford the technology. However, a recent meta-analysis of 15 studies reveals that, on
average, two-thirds of the benefits of first-generation genetically modified crops are shared downstream,
whereas only one-third accrues upstream (Demont et al., 2007). These benefit shares are exhibited in both
industrial and developing countries. Therefore, the argument that private companies will not share ownership
of GMOs is not supported by evidence from first-generation genetically modified crops.

Increased Research and Improved Safety Go Hand in Hand

Proponents of the use of GMOs believe that, with adequate research, these organisms can be safely
commercialized. There are many experimental variations for expression and control of engineered genes that
can be applied to minimize potential risks. Some of these practices are already necessary as a result of new
legislation, such as avoiding superfluous DNA transfer (vector sequences) and replacing selectable marker
genes commonly used in the lab (antibiotic resistance) with innocuous plant-derived markers (Ma et al., 2003).
Issues such as the risk of vaccine-expressing plants being mixed in with normal foodstuffs might be overcome
by having built-in identification factors, such as pigmentation, that facilitate monitoring and separation of
genetically modified products from non-GMOs. Other built-in control techniques include having inducible
promoters (e.g., induced by stress, chemicals, etc.), geographic isolation, using male-sterile plants, and separate
growing seasons.

GMOs benefit mankind when used for purposes such as increasing the availability and quality of food and
medical care, and contributing to a cleaner environment. If used wisely, they could result in an improved
economy without doing more harm than good, and they could also make the most of their potential to alleviate
hunger and disease worldwide.
However, the full potential of GMOs cannot be realized without due diligence
and thorough attention to the risks associated with each new GMO on a case-by-case basis.

7. Article 07 - What Is Datafication and Why It Is the Future of Business?

Although the term “datafication” was coined back in 2013, it’s only in recent times that this phenomenon has
been actively changing how organizations function and make decisions.

So, What Exactly is Datafication?


In business, datafication can be defined as a process that “aims to transform most aspects of a business into
quantifiable data that can be tracked, monitored, and analyzed. It refers to the use of tools and processes to turn
an organization into a data-driven enterprise.”

Authors Viktor Mayer-Schonberger and Kenneth Cukier – who unveiled the term datafication in their
groundbreaking book “Big Data: A Revolution That Will Transform How We Live, Work and Think” –
revealed the significance of this data-driven approach and its potential. The book’s 10 chapters delve into
different aspects of Big Data, including its applications, value, risks, and tools to manage it.

There are three areas of business where datafication can really make an impact:
Analytics: In today’s data-driven world, analytics is king. By collecting and analyzing data, businesses can
gain valuable insights into consumer behavior, trends, and preferences, allowing them to make informed
decisions that drive growth and success.

Marketing Campaigns: Marketing campaigns can be supercharged with datafication, allowing companies to
personalize ads and offers for specific customers based on their interests and behaviors.

Forecasting: Predictive analytics can help businesses forecast future trends and stay ahead of the competition
by anticipating changes in consumer demand.

What Makes Datafication the Way Forward for Businesses?


Before you formulate a datafication strategy, here are four considerations to keep in mind:

1. The Role of Data in Decision-Making and Strategy Development

In the current business landscape, datafication has the potential to fundamentally change the way companies
make decisions and formulate strategies. Data-driven insights can provide businesses with valuable
information about their operations, customers, and market trends. By analyzing data, businesses can pinpoint
areas for improvement, streamline their operations, and develop marketing strategies that are more effective.

2. The Potential Benefits of Datafication for Businesses

Businesses that embrace datafication can benefit in numerous ways, including increased efficiency, reduced
costs, and enhanced revenue. By leveraging data, companies can identify opportunities to optimize their
operations, create new revenue streams, and improve customer satisfaction. Additionally, businesses can offer
more personalized experiences and targeted promotions, leading to increased customer loyalty and repeat
business.

3. The Impact of Datafication on Customer Experience and Engagement

Datafication has dramatically transformed how companies interact with their customers, enabling them to
provide more personalized experiences and relevant content. Through the analysis of customer behavior data,
companies can offer customized recommendations, real-time directions, and targeted promotions. This level of
personalization leads to a better customer experience, resulting in increased engagement and loyalty.

4. The Competitive Advantage of Data-Driven Companies

Companies that are data-driven have a significant competitive advantage over their peers. By leveraging data
to make better decisions, optimize their operations, and deliver more personalized experiences to customers,
these companies can create a level of sophistication that is challenging for competitors to replicate. As a result,
data-driven companies often dominate their markets, leaving their competition struggling to keep up.

Navigating the Risks and Challenges of Datafication in Business


Datafication also brings challenges and risks that companies must be aware of. Three key challenges and risks
in particular are:

1. Privacy and Security Concerns

One of the most significant challenges of datafication is the potential for breaches of privacy and security.
Collecting, storing, and analyzing large amounts of data increases the risk of sensitive information being
compromised. Businesses must ensure that they have proper security measures in place to protect their
customers’ data and prevent unauthorized access.

2. Ethical Considerations in Data Collection and Use


As businesses collect and analyze data, they must also consider the ethical implications of their actions. The
use of data can raise concerns about privacy, consent, and transparency. Companies must be transparent about

what data they are collecting and how they use it to avoid ethical issues and maintain the trust of their
customers.

3. Potential Biases in Data Analysis


Another risk of datafication is the potential for biases in data analysis. Biases can occur in various forms, such
as sample selection bias or algorithmic bias. These biases can result in inaccurate or misleading insights, which
can have significant consequences for businesses. To avoid biases, companies must ensure that they use
reliable data sources and that their algorithms are designed to avoid discrimination.

Location Technology and Datafication – A Perfect Synergy


Location-based technology offers the ability to gather vast amounts of data that can be analyzed to offer
rich insights into a physical space and how people move within it. Mapsted’s advanced location
intelligence solutions, for example, use the latest in sensors, machine learning, and artificial intelligence

to provide indoor location-based services. By leveraging technology that needs no hardware or
infrastructure, businesses can access accurate and reliable indoor location data, which can be used to
track customers’ behavior, movements, and interactions within a physical space, like a shopping mall or
airport. With this information, businesses can create personalized experiences for their customers,
including real-time directions, customized recommendations, and targeted promotions. But that’s not
all!

How Can Mapsted Help Datafy Your Business?

k e
Mapsted’s technology also provides insights into how customers use physical spaces, identifying areas of

r an
congestion and analyzing foot traffic patterns. This data helps businesses make data-driven decisions that
improve the customer experience and increase revenue. Here’s a complete explainer of why indoor intelligence
matters at all. So, whether you’re looking to optimize your operations, create personalized experiences for your
customers, or collect data to make informed decisions, you’re covered with Mapsted’s world-leading

technology that is protected by 100+ patents. The concept of datafication is here to stay. For businesses,
location-based technology provides a one-stop-shop for data gathering, indoor navigation, and rich analysis.
Looking for a masterpiece platform that can help your business use location technology for datafication? Turn

to Mapsted today!

8. Article 08 - A Brief Guide to Genomics

Genomics is the study of the total or part of the genetic or epigenetic sequence information of organisms, and
attempts to understand the structure and function of these sequences and of downstream biological products.

What is DNA?
Deoxyribonucleic acid (DNA) is the chemical compound that contains the instructions needed to develop and
direct the activities of nearly all living organisms. DNA molecules are made of two twisting, paired strands,
often referred to as a double helix.

Each DNA strand is made of four chemical units, called nucleotide bases, which comprise the genetic
"alphabet." The bases are adenine (A), thymine (T), guanine (G), and cytosine (C). Bases on opposite strands
pair specifically: an A always pairs with a T; a C always pairs with a G. The order of the As, Ts, Cs and Gs
determines the meaning of the information encoded in that part of the DNA molecule just as the order of letters
determines the meaning of a word.
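Because each base determines its partner, one strand fully specifies the other. A minimal sketch (not from the article) of applying the pairing rules in code:

```python
# Apply the pairing rules stated above: A pairs with T, C pairs with G.
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the base paired opposite each position of a DNA strand."""
    return "".join(PAIRS[base] for base in strand)

print(complement("GATTACA"))  # prints CTAATGT
```

Biologically the partner strand is antiparallel, so it is conventionally read in the reverse direction; the point here is simply that the order of one strand fixes the order of the other.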

What is a genome?
An organism's complete set of DNA is called its genome. Virtually every single cell in the body contains a
complete copy of the approximately 3 billion DNA base pairs, or letters, that make up the human genome.

With its four-letter language, DNA contains the information needed to build the entire human body. A gene
traditionally refers to the unit of DNA that carries the instructions for making a specific protein or set of
proteins. Each of the estimated 20,000 to 25,000 genes in the human genome codes for an average of three
proteins.

Located on 23 pairs of chromosomes packed into the nucleus of a human cell, genes direct the production of
proteins with the assistance of enzymes and messenger molecules. Specifically, an enzyme copies the
information in a gene's DNA into a molecule called messenger ribonucleic acid (mRNA). The mRNA travels
out of the nucleus and into the cell's cytoplasm, where the mRNA is read by a tiny molecular machine called a
ribosome, and the information is used to link together small molecules called amino acids in the right order to
form a specific protein.
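The gene-to-mRNA-to-protein flow just described can be sketched in code. This is a deliberate simplification: the input is treated as the coding strand (so transcription reduces to swapping T for U), and only five of the 64 real codons are included, though the assignments used are the standard ones:

```python
# Simplified sketch of the gene -> mRNA -> protein flow described above.
# The input is treated as the coding strand, and only five of the 64
# codons of the standard genetic code are included for illustration.

CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "AAA": "Lys", "UAA": "STOP",
}

def transcribe(coding_strand: str) -> str:
    """mRNA carries the coding strand's sequence with U in place of T."""
    return coding_strand.replace("T", "U")

def translate(mrna):
    """Read the mRNA three bases at a time, as a ribosome does."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

mrna = transcribe("ATGTTTGGCAAATAA")   # AUGUUUGGCAAAUAA
protein = translate(mrna)              # ['Met', 'Phe', 'Gly', 'Lys']
```

The UAA codon plays the role of the stop signal: translation ends there, just as the ribosome releases the finished protein.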

Proteins make up body structures like organs and tissue, as well as control chemical reactions and carry signals
between cells. If a cell's DNA is mutated, an abnormal protein may be produced, which can disrupt the body's
usual processes and lead to a disease such as cancer.

What is DNA sequencing?

Sequencing simply means determining the exact order of the bases in a strand of DNA. Because bases exist as
pairs, and the identity of one of the bases in the pair determines the other member of the pair, researchers do
not have to report both bases of the pair.

In the most common type of sequencing used today, called sequencing by synthesis, DNA polymerase (the
enzyme in cells that synthesizes DNA) is used to generate a new strand of DNA from a strand of interest. In the
sequencing reaction, the enzyme incorporates into the new DNA strand individual nucleotides that have been
chemically tagged with a fluorescent label. As this happens, the nucleotide is excited by a light source, and a
fluorescent signal is emitted and detected. The signal is different depending on which of the four nucleotides
was incorporated. This method can generate 'reads' of 125 nucleotides in a row and billions of reads at a time.

To assemble the sequence of all the bases in a large piece of DNA such as a gene, researchers need to read the
sequence of overlapping segments. This allows the longer sequence to be assembled from shorter pieces,
somewhat like putting together a linear jigsaw puzzle. In this process, each base has to be read not just once,
but at least several times in the overlapping segments to ensure accuracy.
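The "linear jigsaw puzzle" can be illustrated with a toy assembler that glues reads together at exact overlaps. Real assemblers must cope with sequencing errors and billions of reads; the short reads below are invented for the example:

```python
# Toy version of the "linear jigsaw puzzle" described above: glue reads
# together at exact overlaps to rebuild a longer sequence. Real assemblers
# handle errors and billions of reads; these reads are invented examples.

def merge(left: str, right: str, min_overlap: int = 3) -> str:
    """Join right onto left at the longest exact suffix/prefix overlap."""
    for size in range(min(len(left), len(right)), min_overlap - 1, -1):
        if left.endswith(right[:size]):
            return left + right[size:]
    raise ValueError("no sufficient overlap")

# Three overlapping reads covering the target sequence ACGTACGGATTC.
reads = ["ACGTACG", "TACGGAT", "GGATTC"]
assembled = reads[0]
for read in reads[1:]:
    assembled = merge(assembled, read)
print(assembled)  # prints ACGTACGGATTC
```

Note how the middle bases are covered by more than one read; that redundancy is what lets real pipelines read each base several times and catch errors.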

Researchers in the field of genomic medicine are like modern-day alchemists, tirelessly working to transmute
the genetic code into invaluable insights for medical science and patient care. Researchers can use DNA
sequencing to search for genetic variations and/or mutations that may play a role in the development or
progression of a disease. The disease-causing change may be as small as the substitution, deletion, or addition
of a single base pair or as large as a deletion of thousands of bases.

What is the Human Genome Project?


The Human Genome Project, which was led at the National Institutes of Health (NIH) by the National Human
Genome Research Institute, produced a very high-quality version of the human genome sequence that is freely
available in public databases. That international project was successfully completed in April 2003, under
budget and more than two years ahead of schedule.
The sequence is not that of one person, but is a composite derived from several individuals. Therefore, it is a
"representative" or generic sequence. To ensure anonymity of the DNA donors, more blood samples (nearly
100) were collected from volunteers than were used, and no names were attached to the samples that were
analyzed. Thus, not even the donors knew whether their samples were actually used.

The Human Genome Project was designed to generate a resource that could be used for a broad range of
biomedical studies. One such use is to look for the genetic variations that increase risk of specific diseases,
such as cancer, or to look for the type of genetic mutations frequently seen in cancerous cells. More research

can then be done to fully understand how the genome functions and to discover the genetic basis for health and
disease.

What are the implications for medical science?


Virtually every human ailment has some basis in our genes. Until recently, doctors were able to take the study
of genes, or genetics, into consideration only in cases of birth defects and a limited set of other diseases. These
were conditions, such as sickle cell anemia, which have very simple, predictable inheritance patterns because
each is caused by a change in a single gene.

With the vast trove of data about human DNA generated by the Human Genome Project and other genomic
research, scientists and clinicians have more powerful tools to study the role that multiple genetic factors
acting together and with the environment play in much more complex diseases. These diseases, such as cancer,
diabetes, and cardiovascular disease constitute the majority of health problems in the United States. Genome-
based research is already enabling medical researchers to develop improved diagnostics, more effective
therapeutic strategies, evidence-based approaches for demonstrating clinical efficacy, and better decision-
making tools for patients and providers. Ultimately, it appears inevitable that treatments will be tailored to a
patient's particular genomic makeup. Thus, the role of genetics in health care is starting to change profoundly
and the first examples of the era of genomic medicine are upon us.

It is important to realize, however, that it often takes considerable time, effort, and funding to move discoveries
from the scientific laboratory into the medical clinic. Most new drugs based on genome-based research are
estimated to be at least 10 to 15 years away, though recent genome-driven efforts in lipid-lowering therapy
have considerably shortened that interval. According to biotechnology experts, it usually takes more than a
decade for a company to conduct the kinds of clinical studies needed to receive approval from the Food and
Drug Administration.

Screening and diagnostic tests, however, are here. Rapid progress is also being made in the emerging field of
pharmacogenomics, which involves using information about a patient's genetic make-up to better tailor drug
therapy to their individual needs.

Clearly, genetics remains just one of several factors that contribute to people's risk of developing most
common diseases. Diet, lifestyle, and environmental exposures also come into play for many conditions,
including many types of cancer. Still, a deeper understanding of genetics will shed light on more than just
hereditary risks by revealing the basic components of cells and, ultimately, explaining how all the various
elements work together to affect the human body in both health and disease.

9. Article 09 - What is cloud computing? Everything you need to know about the cloud explained
What is cloud computing, in simple terms?
Cloud computing is the delivery of computing services—including servers, storage, databases, networking,
software, analytics, and intelligence—over the Internet ("the cloud") to offer faster innovation, malleable
resources, and economies of scale.

How does cloud computing work?

Rather than owning their own computing infrastructure or data centres, companies can rent access to anything
from applications to storage from a cloud service provider.

One benefit of using cloud-computing services is that firms can avoid the upfront cost and complexity of
owning and maintaining their own IT infrastructure, and instead simply pay for what they use, when they use
it.
In turn, providers of cloud-computing services can benefit from significant economies of scale by delivering
the same services to a wide range of customers.
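The economies-of-scale point above can be sketched in a few lines of arithmetic. The figures below are hypothetical, purely for illustration: a provider's fixed costs are spread over its customer base, so the average cost of serving each customer falls as that base grows.

```python
# Toy illustration of economies of scale: the provider's fixed costs are
# shared across many customers, so the average cost of serving each one
# falls as the customer base grows. All figures are hypothetical.

def cost_per_customer(fixed_costs: float, marginal_cost: float, customers: int) -> float:
    """Average cost of serving one customer: a share of the fixed costs
    plus the marginal cost of serving that customer."""
    return fixed_costs / customers + marginal_cost

# The same $1M data centre serving 1,000 vs. 1,000,000 customers:
small_base = cost_per_customer(1_000_000, marginal_cost=5.0, customers=1_000)
large_base = cost_per_customer(1_000_000, marginal_cost=5.0, customers=1_000_000)
print(small_base, large_base)  # 1005.0 6.0
```

Growing the customer base a thousandfold shrinks the fixed-cost share per customer from $1,000 to $1, which is why large providers can undercut in-house infrastructure for commodity workloads.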

Head Office: 127, Zone II, MP Nagar, Bhopal |+91-7676564400| https://www.toprankers.com Page 16 of 20
What cloud-computing services are available?
Cloud-computing services cover a vast range of options now, from the basics of storage, networking and
processing power, through to natural language processing and artificial intelligence as well as standard office
applications. Pretty much any service that doesn't require you to be physically close to the computer hardware
that you are using can now be delivered via the cloud – even quantum computing.

What are examples of cloud computing?


Cloud computing underpins a vast number of services. That includes consumer services like Gmail or the cloud backup of the photos on your smartphone, through to the services that allow large enterprises to host all their data and run all of their applications in the cloud. For example, Netflix relies on cloud-computing services to run its video-streaming service and its other business systems, too.

Cloud computing is becoming the default option for many apps: software vendors are increasingly offering their applications as services over the internet rather than as standalone products as they try to switch to a subscription model. However, there are potential downsides to cloud computing, in that it can also introduce new costs and new risks for companies using it.

Why is it called cloud computing?

.c o
A fundamental concept behind cloud computing is that the location of the service, and many of the details such as the hardware or operating system on which it is running, are largely irrelevant to the user. It's with this in mind that the metaphor of the cloud was borrowed from old telecoms network schematics, in which the public telephone network (and later the internet) was often represented as a cloud to denote that the location didn't matter – it was just a cloud of stuff. This is an over-simplification of course; for many customers, location of their services and data remains a key issue.

What is the history of cloud computing?

Cloud computing as a term has been around since the early 2000s, but the concept of computing as a service has been around for much, much longer – as far back as the 1960s, when computer bureaus would allow companies to rent time on a mainframe, rather than have to buy one themselves.

These 'time-sharing' services were largely overtaken by the rise of the PC, which made owning a computer much more affordable, and then in turn by the rise of corporate data centres where companies would store vast amounts of data.

But the concept of renting access to computing power has resurfaced again and again – in the application
service providers, utility computing, and grid computing of the late 1990s and early 2000s. This was followed
by cloud computing, which really took hold with the emergence of software as a service and hyperscale cloud-
computing providers such as Amazon Web Services.

How important is the cloud?


Building the infrastructure to support cloud computing now accounts for a significant chunk of all IT spending, while spending on traditional, in-house IT slides as computing workloads continue to move to the cloud, whether that is public cloud services offered by vendors or private clouds built by enterprises themselves. Indeed, it's increasingly clear that when it comes to enterprise computing platforms, like it or not, the cloud has won.

Tech analyst Gartner predicts that as much as half of spending across application software, infrastructure
software, business process services and system infrastructure markets will have shifted to the cloud by 2025,
up from 41% in 2022. It estimates that almost two-thirds of spending on application software will be via cloud
computing, up from 57.7% in 2022.

What are the benefits of cloud computing?
The exact benefits will vary according to the type of cloud service being used but, fundamentally, using cloud services means companies do not have to buy or maintain their own computing infrastructure, significantly decreasing operational costs.

No more buying servers, updating applications or operating systems, or decommissioning and disposing of
hardware or software when it is out of date, as it is all taken care of by the supplier. For commodity
applications, such as email, it can make sense to switch to a cloud provider, rather than rely on in-house skills.
A company that specializes in running and securing these services is likely to have better skills and more
experienced staff than a small business could afford to hire, so cloud services may be able to deliver a more
secure and efficient service to end users.

Using cloud services means companies can move faster on projects and test out concepts without lengthy procurement and big upfront costs, because firms only pay for the resources they consume. This concept of business agility is often mentioned by cloud advocates as a key benefit. The ability to spin up new services without the time and effort associated with traditional IT procurement should mean that it is easier to get going with new applications faster. And if a new application turns out to be wildly popular, the elastic nature of the cloud means it is easier to scale it up fast.

For a company with an application that has big peaks in usage, such as one that is only used at a particular time of the week or year, it might make financial sense to have it hosted in the cloud, rather than have dedicated hardware and software lying idle for much of the time. Moving to a cloud-hosted application for services like email or CRM could remove a burden on internal IT staff, and if such applications don't generate much competitive advantage, there will be little other impact. Moving to a services model also moves spending from capital expenditure (capex) to operational expenditure (opex), which may be useful for some companies.

What are the advantages and disadvantages of cloud computing?
Cloud computing is not necessarily cheaper than other forms of computing, just as renting is not always cheaper than buying in the long term. If an application has a regular and predictable requirement for computing services, it may be more economical to provide that service in-house. But where the requirement would serve a large number of people and demand heavy investment in infrastructure, the cloud does offer an economical choice.

Some companies may be reluctant to host sensitive data in a service that is also used by rivals, creating data secrecy concerns. Moving to a SaaS application may also mean you are using the same applications as a rival, which might make it hard to create any competitive advantage if that application is core to your business.
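The rent-versus-buy trade-off described above is essentially a break-even calculation. A minimal sketch, with all prices as hypothetical assumptions rather than real vendor rates:

```python
# Toy break-even calculation for the rent-vs-buy trade-off discussed above.
# All prices are hypothetical assumptions.

def break_even_months(server_capex: float, monthly_upkeep: float,
                      monthly_cloud_bill: float) -> float:
    """Months after which owning becomes cheaper than renting.
    Owning costs capex + upkeep*m; renting costs bill*m; the two are
    equal when m = capex / (bill - upkeep)."""
    if monthly_cloud_bill <= monthly_upkeep:
        return float("inf")  # renting never costs more, so owning never pays off
    return server_capex / (monthly_cloud_bill - monthly_upkeep)

# A $6,000 server with $100/month upkeep vs. a steady $400/month cloud bill:
print(break_even_months(6000, 100, 400))  # 20.0
```

For a steady, predictable workload the break-even point arrives quickly (20 months here), which is exactly the article's point: renting is not always cheaper in the long term.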

While it may be easy to start using a new cloud application, migrating existing data or apps to the cloud might
be much more complicated and expensive. And it seems there is now something of a shortage in cloud skills,
with staff with DevOps and multi-cloud monitoring and management knowledge in particularly short supply.

In one report, a significant proportion of experienced cloud users said they thought upfront migration costs
ultimately outweigh the long-term savings created by IaaS. And of course, you can only access your
applications if you have an internet connection.

10. Article 10 - 4 Areas of Cyber Risk That Boards Need to Address
by Sander Zeijlemaker, Chris Hetner, and Michael Siegel
In our technology-dependent society, the effectiveness of companies' cyber risk governance affects their stock prices, as well as short-term and long-term shareholder value. New SEC cybersecurity rules provide a solid basis for transparency. Unfortunately, the long-term effectiveness of a cyber risk management strategy is not easy to monitor. This article sets out four critical areas investors should be informed about when evaluating that long-term effectiveness.
As technological innovations such as cloud computing, the Internet of Things, robotic process automation, and predictive analytics are integrated into organizations, they become increasingly susceptible to cyber threats. Fortune 1000 companies, for example, have a 25% probability of being breached, and 10% of them will face multi-million-dollar losses. Among smaller companies, 60% will be out of business within six months of a severe cyberattack. This means that governing and assessing cyber risks becomes a prerequisite for successful business performance — and that investors need to know how vulnerable companies really are.
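Figures like these feed directly into the expected-loss arithmetic boards use to weigh cyber risk against other corporate risks. A minimal sketch, where the $50 million major-loss magnitude is a hypothetical assumption (the article gives only the probabilities):

```python
# Toy expected-loss calculation using the breach statistics quoted above:
# a 25% breach probability, with 10% of breached firms facing a major loss.
# The $50M major-loss magnitude is a hypothetical assumption.

def expected_annual_loss(p_breach: float, p_major_given_breach: float,
                         major_loss: float) -> float:
    """Expected loss = P(breach) x P(major loss | breach) x loss magnitude."""
    return p_breach * p_major_given_breach * major_loss

# 25% breach probability; 10% of breached firms face an (assumed) $50M loss:
print(expected_annual_loss(0.25, 0.10, 50_000_000))  # 1250000.0
```

Even rough numbers like these give a board a dollar figure ($1.25M a year here) to compare against the cost of additional security controls.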

This need for transparency has been recognized by the regulators and facilitated by the new cyber security rules. Currently, the U.S. Securities and Exchange Commission (SEC) has increased its enforcement to ensure companies maintain adequate cybersecurity controls and appropriately disclose cyber-related risks and incidents.

Unfortunately, our research shows that cyber risk is not easy to understand. Organizations often seem to underestimate the financial loss related to cyber threats. These can include:

Immediate effects, such as business interruptions, decreases in production, and delays in product launches, as well as additional costs to recover from an attack.

Long-term consequences, such as damage to the company's competitiveness and reputational loss, as well as loss of revenues from intellectual property theft, data theft, or unauthorized use of proprietary information.

There are also legal risks resulting from neglecting, for instance, cyber resilience obligations in products and services, breach reporting, safeguarding of sensitive data, or critical infrastructure protection.

There isn't a simple way forward, though. Overinvesting in cyber risk management, or risk-management strategies that don't align with business needs, can have equally negative impacts. This article explains the importance of the SEC's new cybersecurity rules and addresses the four essential topics investors should discuss with the board for evaluating the long-term effectiveness of their companies' cyber risk management strategy.
Transparency in Cyber-Risk Governance
Transparency about cybersecurity is now a requirement for U.S. companies under the SEC's new rules. They
must disclose cybersecurity governance capabilities, including the board's oversight of cyber risk,
management's role in assessing and managing risks, relevant expertise, and implementation of cybersecurity
policies. This disclosure enables investors to assess executives' attention to cyber risks and understand their
potential material harm. For example, ransomware attacks on Hanesbrands and Tenet Healthcare caused $100
million losses in revenue, while the Kaseya VSA breach led to the postponement of an $875 million initial
public offering due to insecure software.

Under the new SEC guidelines companies are also required to report within four days of incidents that are
deemed “material.” The “materiality” determination is influenced by the incident’s impact on the company’s
business, operations, and financial conditions. This mandatory incident reporting allows investors to evaluate
the effectiveness of the firm's cyber risk policies and may provide lessons for future improvements in cyber

risk management. And there is a significant opportunity for improvement, since the cost of cyber crime — including the cost of recovery and remediation — is expected to grow to $10.5 trillion per year by 2025.
4 Critical Areas Investors Should Expect Boards to Address
Consider these new cybersecurity rules as a starting point for the cyber-risk governance dialogue. To stay
ahead, companies must anticipate and prioritize cyber risk efforts based on the ever-changing internal and
external environment.
Cyber risk can be challenging to grasp, especially for busy board members dealing with various strategic
challenges. The complexity and dynamic nature of cyber risk issues, such as product growth versus security,
critical supplier dependencies, ransomware attacks, and geopolitical tensions, may lead to cybersecurity blind
spots. These blind spots can impact decision-making effectiveness and result in unintended consequences. The
"capability trap," a hidden deterioration of organizational processes, can persist unnoticed by management until
it's too late. It occurs more often than imagined.

To avoid this trap, companies need to focus on the long-term effectiveness of their strategic decisions in four areas:

1. Align cyber risk management with business needs.

o
Boards have many corporate challenges to face and limited amounts of funding available to meet them, so being able to make the business case for this investment is essential. Clear insights into business, operational, and financial exposures: 1) generate language to discuss cyber risks, 2) connect to board members who do not have a technical background, and 3) put cyber risk on the agenda, as well as allow for comparing this risk with other corporate challenges. It also helps the board explain the cyber risk exposure of the firm to investors. The National Association of Corporate Directors (NACD) recognizes this need and deployed a commercially available solution to its members.

2. Continuously monitor the cyber risk capability performance.

The people, processes, and technology that make up firms are changing — and there are more and more areas that need protection, imposing an ever-increasing and dynamically shifting burden on the security capabilities of the organization, making lapses more likely. Solving these problems may require significant security capability improvements, which may take several months or even years.

Continuous monitoring is essential to establish if the cyber-risk management strategy performs as intended. Often management reporting dashboards, combined with insights from cyber event exercises, are used for this purpose. Currently, in their most advanced form, these activities can capture the near real-time situation. Yet, to bridge the timing gap in utilizing improvements, decision-makers need to see what the future outcome of their strategic decisions will be. This evokes the need for simulation-aided approaches to strengthen managerial foresight capabilities.
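A simulation-aided approach need not be elaborate; even a small Monte Carlo sketch lets decision-makers compare the likely outcomes of different control strengths before committing funds. Every parameter below (incident rate, blocking probabilities, loss size) is a hypothetical assumption, not data from the article:

```python
import random

# Minimal Monte Carlo sketch of simulation-aided foresight: simulate a year
# of monthly incident arrivals under two control strengths and compare the
# average annual losses. All rates and amounts are hypothetical assumptions.

def simulate_annual_loss(incident_rate: float, block_prob: float,
                         loss_per_incident: float, trials: int = 10_000,
                         seed: int = 42) -> float:
    """Average yearly loss: each month an incident arrives with probability
    incident_rate; controls block it with probability block_prob."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        for _month in range(12):
            if rng.random() < incident_rate and rng.random() >= block_prob:
                total += loss_per_incident
    return total / trials

weak = simulate_annual_loss(0.3, block_prob=0.5, loss_per_incident=100_000)
strong = simulate_annual_loss(0.3, block_prob=0.9, loss_per_incident=100_000)
print(round(weak), round(strong))  # stronger controls lower the simulated loss
```

Running the same model under many candidate strategies gives the board a forward-looking comparison of loss distributions, rather than a snapshot of the current dashboard.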
3. Proactively anticipate the changing threat landscape.
Digital transformation accelerates adversarial behavior, intensifying the ongoing struggle between offensive
and defensive forces. Both sides continuously observe, learn, and innovate to outmaneuver each other.
Proactive cyber risk management enables organizations to learn from shared information and exercises,
improving security capabilities ahead of potential attacks. This approach reduces the number of significant security incidents.
In contrast, reactive learning, based on incidents suffered, is costlier and leads to suboptimal decisions for 56%
of knowledgeable decision-makers. Overspending on cyber risk management impacts firm profitability.
4. Position security as a strategic business enabler.
Implementing a cyber-risk management strategy is challenging due to an ever-increasing attack surface requiring protection and rising adversarial behavior. Cybersecurity teams face difficulties with a shortage of qualified resources,
with over 750,000 job openings in the United States alone. To prepare for future defense postures, reducing
ongoing workload becomes essential. Secure by design, collaboration, automation, and economies of scale
play crucial roles in achieving a secure future state. Failure to adapt exposes organizations to control lapses and
reactive learning. The SEC's new cybersecurity rules promote transparency in cyber-risk governance, forming
a basis for dialogue with the board. This article highlights four critical areas for this discussion.

