200 Questões #1 - Inglês - Técnico Bancário - Caixa
Inglês
Técnico Bancário Novo
Tec Concursos
Caixa 2024
www.tecconcursos.com.br/questoes/2779515
Americans might be reluctant to believe it, but on paper, the U.S. economy is doing pretty well. So well, in fact, that we’re performing better than forecasts made even
before the pandemic began.
The nation’s employers added another 199,000 jobs in November, the U.S. Bureau of Labor Statistics reported on Friday. This means that overall employment is now 2
million jobs higher than was expected by now in forecasts made way back in January 2020 by the nonpartisan Congressional Budget Office.
The job market isn’t the only front on which we have bested forecasts made before the pandemic. The overall size of the economy, as measured by gross domestic
product, is larger than it was expected to be around now. The International Monetary Fund says that U.S. gross domestic product is higher today, in inflation-adjusted
terms, than it had expected at the beginning of 2020. The IMF ran these calculations for countries around the world, and found the United States was an outlier in beating
the organization’s pre-covid forecasts.
So why did well-regarded professional forecasters underestimate the strength of the economy? And how is it that jobs and GDP are doing better than they expected, even
as inflation has been unmistakably worse?
To some extent, all these things are related. Forecasters obviously did not anticipate the pandemic. They also did not anticipate the unprecedentedly enormous government
response to the coronavirus. When the public health crisis hit and disemployed millions of American workers, policymakers implemented unusually generous fiscal and
monetary stimulus.
Such measures helped get people back to work sooner, and avoided the long, painful effort back to normal that had followed the Great Recession. Thus, faster job growth.
They also massively amplified consumer demand, at a time when the productive capacity of the economy (i.e., companies’ ability to make and deliver the things their
customers want) couldn’t keep up. Employers faced all kinds of shortages — of products, materials, workers — and consumers anxious to buy stuff raised the prices of
whatever inventory companies actually had available. Thus, faster price growth.
If you had asked me back in January 2020 how Americans might feel about an economy with an “extra” 2 million jobs, unemployment less than 4 percent, and inflation just
over 3 percent, I probably would have guessed the public would be pretty content. However, people are still furious about the extra price growth they’ve already endured to
date, and unimpressed by all that extra job growth. Maybe it’s human nature for people to view better jobs or pay as things they’ve earned, while a painful price increase
is something inflicted upon them — even if both are, to some extent, two sides of the same coin.
Available at: https://www.washingtonpost.com/opinions/2023/12/08/jobs-report-economy-beats-pandemicpredictions/. Retrieved on: Dec. 12, 2023. Adapted.
According to Text I,
a) although the job market and the GDP are getting worse, prices are decreasing.
b) the increasing unemployment has contributed to people’s positive perception of the U.S. economy.
c) the current predictions about the U.S. economy indicate that the job market will worsen in the future.
d) despite people’s negative perception, the U.S. economy is doing well.
e) excessively positive forecasts about the U.S. economy have pushed prices up.
www.tecconcursos.com.br/questoes/2397932
Inflation is the most serious problem facing the Federal Reserve and “may take some time” to address, Fed Governor Philip Jefferson said on Tuesday in his first public
remarks since joining the U.S. central bank’s governing body.
“Restoring price stability may take some time and will likely result in a period of below-trend growth,” Jefferson told a conference in Atlanta, joining the current Fed
consensus for continued interest rate increases to battle price pressures.
“I want to assure you that my colleagues and I are resolute that we will bring inflation back down to 2% ... We are committed to taking the further steps necessary.”
Monetary policy that stabilizes inflation “can produce long-term, noninflationary economic expansions ... that economic history suggests is an ideal framework or
environment for inclusive growth,” Jefferson said. “So, it is important that we get back to that kind of economy. And that is what I think the intent of the Fed is.”
Fed Chair Jerome Powell has admitted that the central bank’s intent to slow economic growth will cause economic “pain” and likely increased unemployment, but that the
worst outcome would be to let inflation take root.
In his remarks, Jefferson said there are reasons to think rigid conditions in the labor market are already easing. Indeed new data on Tuesday showed a severe decrease in
job openings in August that began to bring the number of workers sought by companies more in line with the numbers of unemployed.
That could help reduce salary growth, Jefferson said, and there were indications as well that “supply bottlenecks have, finally, begun to resolve,” and could also help slow
down price increases.
www.tecconcursos.com.br/questoes/2397937
Inflation is the most serious problem facing the Federal Reserve and “may take some time” to address, Fed Governor Philip Jefferson said on Tuesday in his first public
remarks since joining the U.S. central bank’s governing body.
“Restoring price stability may take some time and will likely result in a period of below-trend growth,” Jefferson told a conference in Atlanta, joining the current Fed
consensus for continued interest rate increases to battle price pressures.
“I want to assure you that my colleagues and I are resolute that we will bring inflation back down to 2% ... We are committed to taking the further steps necessary.”
Monetary policy that stabilizes inflation “can produce long-term, noninflationary economic expansions ... that economic history suggests is an ideal framework or
environment for inclusive growth,” Jefferson said. “So, it is important that we get back to that kind of economy. And that is what I think the intent of the Fed is.”
Fed Chair Jerome Powell has admitted that the central bank’s intent to slow economic growth will cause economic “pain” and likely increased unemployment, but that the
worst outcome would be to let inflation take root.
In his remarks, Jefferson said there are reasons to think rigid conditions in the labor market are already easing. Indeed new data on Tuesday showed a severe decrease in
job openings in August that began to bring the number of workers sought by companies more in line with the numbers of unemployed.
That could help reduce salary growth, Jefferson said, and there were indications as well that “supply bottlenecks have, finally, begun to resolve,” and could also help slow
down price increases.
But it remains uncertain how that will work, and in the meantime “inflation remains elevated, and this is the problem that concerns me most,” Jefferson said. “Inflation
creates economic burdens for households and businesses, and everyone feels its effects.”
In the segment of the paragraph “the worst outcome would be to let inflation take root”, the words “would be” signal
a) a certain future
b) a definite past
c) a hypothetical possibility
d) an indefinite present
e) an inevitable destiny
www.tecconcursos.com.br/questoes/2398730
New age technologies such as Artificial Intelligence (AI) and Machine Learning (ML) have radically transformed the way banking works today.
Thanks to AI, it is possible to conduct real-time data analysis from a large volume of data sets and provide customized solutions to banking customers.
With powerful AI tools, banks can make informed decisions faster by using predictive analysis, which is the central point of AI and ML. As soon as a potential customer
searches for something online, the AI tools pick it up and serve related content that leads to quick sales. This improves customer service tremendously as customers find
tailor-made solutions without much human intervention.
Banks’ lending processes have also improved considerably as they can analyze customers’ spending patterns, study different customer data points, and determine
borrowers’ credit conditions. So, there is much less paperwork.
Customer-centric banking has become indispensable with the introduction of different kinds of software that utilize Natural Language Processing (NLP) to read, process and
understand text and speech. Banks have successfully installed digital tools to answer customer questions, which has helped them reduce the time and effort of human
capital and provide quick and consistent service. Using those resources, banks are expected to save $7.3 billion in operational costs.
The changing profile of banking depends a lot on the Internet-age generation. Their expectations from their banks to provide an omni-digital experience have enabled the
shift, allowing them to fulfil their banking needs sitting from a remote location. Appropriately, banks quickly jumped onto the digitalization movement and refreshed their
services in line with their requirements.
Mobile banking, for example, is very popular among millennials. An Insider Intelligence’s Mobile Banking Competitive Edge study indicated that a surprising 97% of them
use mobile banking! Transferring funds, checking their transactions online, downloading their account statements or even applying for a loan is possible through a click of
fingers on their mobile phones. This has also eliminated the need for physical branches, enabling banks to operate in a lean manner and cut unnecessary costs.
The changing customer profile inclines towards bringing both physical and digital worlds closer, and this is influencing the finance and banking sector favorably. Banks give
attention to this need for digitalization to retain their customers in the long run.
The pandemic of Covid-19 helped the banking industry to depend heavily on digital technology and tech-enabled systems to stay alive. The result of the pandemic,
however, resulted in new beginnings in the form of huge digital transformation and newer business models for the banks.
The favorable impact of technology is obvious across banking institutions. Even though the banking arena has advanced in achieving digital involvement, many more
unexploited opportunities exist for banks. The banks must maintain the sanctity of their customers’ data and serve them with better solutions without having to sacrifice
their security. The few challenges the banking sector still has are data breaches or escapes, lack of e-banking knowledge amongst their customers, and the permanent
technological landscape that requires constant training and updating. Plausible solutions to the above are available with a positive partnership between all stakeholders
involved, such as government, industry professionals and, of course, different banking institutions.
www.tecconcursos.com.br/questoes/2398742
New age technologies such as Artificial Intelligence (AI) and Machine Learning (ML) have radically transformed the way banking works today.
Thanks to AI, it is possible to conduct real-time data analysis from a large volume of data sets and provide customized solutions to banking customers.
With powerful AI tools, banks can make informed decisions faster by using predictive analysis, which is the central point of AI and ML. As soon as a potential customer
searches for something online, the AI tools pick it up and serve related content that leads to quick sales. This improves customer service tremendously as customers find
tailor-made solutions without much human intervention.
Banks’ lending processes have also improved considerably as they can analyze customers’ spending patterns, study different customer data points, and determine
borrowers’ credit conditions. So, there is much less paperwork.
Customer-centric banking has become indispensable with the introduction of different kinds of software that utilize Natural Language Processing (NLP) to read, process and
understand text and speech. Banks have successfully installed digital tools to answer customer questions, which has helped them reduce the time and effort of human
capital and provide quick and consistent service. Using those resources, banks are expected to save $7.3 billion in operational costs.
The changing profile of banking depends a lot on the Internet-age generation. Their expectations from their banks to provide an omni-digital experience have enabled the
shift, allowing them to fulfil their banking needs sitting from a remote location. Appropriately, banks quickly jumped onto the digitalization movement and refreshed their
services in line with their requirements.
Mobile banking, for example, is very popular among millennials. An Insider Intelligence’s Mobile Banking Competitive Edge study indicated that a surprising 97% of them
use mobile banking! Transferring funds, checking their transactions online, downloading their account statements or even applying for a loan is possible through a click of
fingers on their mobile phones. This has also eliminated the need for physical branches, enabling banks to operate in a lean manner and cut unnecessary costs.
The usage of credit cards, debit cards, mobile banking apps, mobile wallets, third-party payment apps, etc., have all increased considerably, indicating an essential shift in
the customers’ preferences. Banks have modernized their processes and broken the barriers between the different entities involved, such as branches, ATMs, and online
banking, to create a continuous flow for their customers.
The changing customer profile inclines towards bringing both physical and digital worlds closer, and this is influencing the finance and banking sector favorably. Banks give
attention to this need for digitalization to retain their customers in the long run.
The pandemic of Covid-19 helped the banking industry to depend heavily on digital technology and tech-enabled systems to stay alive. The result of the pandemic,
however, resulted in new beginnings in the form of huge digital transformation and newer business models for the banks.
The favorable impact of technology is obvious across banking institutions. Even though the banking arena has advanced in achieving digital involvement, many more
unexploited opportunities exist for banks. The banks must maintain the sanctity of their customers’ data and serve them with better solutions without having to sacrifice
their security. The few challenges the banking sector still has are data breaches or escapes, lack of e-banking knowledge amongst their customers, and the permanent
technological landscape that requires constant training and updating. Plausible solutions to the above are available with a positive partnership between all stakeholders
involved, such as government, industry professionals and, of course, different banking institutions.
In the fragment of the text “The changing profile of banking depends a lot on the Internet-age generation”, the expression in bold, “Internet-age generation”, refers to people
who
a) do not have digital equipment.
b) dislike digital communication.
c) can not use the world wide web.
d) constantly use Internet services.
e) do not use virtual communication.
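The passage above describes, in general terms, NLP-based tools that answer customer questions and predictive analysis that points customers to relevant offers. Purely as an illustration of that idea, not taken from the text and not any bank's real system, a minimal keyword-based question router could be sketched in Python as follows (all intent names and keywords are hypothetical):

# Illustrative sketch only: a toy keyword-based question router in the spirit
# of the NLP-driven tools the passage describes. Intents and keywords are made up.
from typing import Optional

INTENTS = {
    "balance": ["balance", "available funds", "how much do i have"],
    "loan": ["loan", "borrow", "credit"],
    "statement": ["statement", "transactions", "history"],
}

def route_question(question: str) -> Optional[str]:
    """Return the first intent whose keywords appear in the question, if any."""
    text = question.lower()
    for intent, keywords in INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return None  # no match: hand the question off to a human agent

print(route_question("Can I apply for a loan on the app?"))  # prints: loan

Real systems of this kind rely on trained language models rather than keyword lists, but the routing idea the passage gestures at is the same.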
www.tecconcursos.com.br/questoes/2399407
This doesn’t necessarily mean you want them to have the best clothes, the latest toys or coolest gadgets.
Most likely, it means you want to lay a foundation that they can build upon to do well in life. “Without a working knowledge of money, it is extraordinarily difficult to do well
in life,” says Sam X. Renick, cocreator of Sammy Rabbit, a children’s character and financial literacy initiative. “Money is central to managing life, day-in and day-out.
Where we live, what we eat, the clothes we wear, the car we drive, health care, education, child-raising, gift giving, vacations, entertainment, heat, air-conditioning,
insurance—you name it, money is involved.” If you want to play a key role in shaping your children’s feelings, thinking and values about money, you need to give them the
gift of financial literacy from an early age. Lessons should begin before age seven, Renick says, because research shows that money habits and attitudes are already
formed by then. Actually, showing them how money works is more effective, so let them see you buying things with cash.
Your kids’ early interactions with money will likely involve spending. They see you using it to buy things, including things for them. So it’s important to teach them from a
young age that money isn’t just for spending— they should be saving money regularly, too. “Saving teaches discipline and delayed gratification,” Renick says. “Saving
teaches goal-setting and planning. It emphasizes being prepared, and it builds security and independence.” Help your kids get in the habit of saving by giving them a piggy
bank or savings jar where they can deposit coins or cash.
Kids need to have money of their own so they can learn how to make decisions about using it. An allowance can accomplish that. However, you should consider requiring
your kids to do certain tasks to earn their allowance. “Just about everyone values money they earn differently than money they receive,” Renick says. There are some kinds
of housework the kids have to do without pay because they’re expected to help out as part of a family. But they can have specific activities they need to complete if they
want to get paid.
In addition to wanting his kids to understand that money is earned, it is important that they can learn to live within a budget. “My two youngest children would constantly
ask for money and spend like drunken sailors,” says Tim Sheehan, co-founder and CEO of Greenlight, a debit card for kids with parental controls. “When I started paying
A key reason that it is important for you, as a parent, to teach your kids financial lessons is because you can share your money values through those lessons. If you value
giving to others, you can introduce that value to your children by helping make it a habit for them from an early age. You could do as Chase Peckham – from the San Diego
Financial Literacy Center – did with his kids, when they were little, and create spending, saving and giving jars.
Then help your children plan their giving by discussing what groups or causes they want to support.
Just as important as the lessons you teach your kids about money are the ways you discuss and handle money when you’re around them. For example, if you complain
about having to spend too much on certain things and then take your kids out for compulsive shopping, you’re sending mixed messages. If you want your children to
develop good spending and saving habits, they need to see you making smart spending and saving choices. In short, practice what you preach. And preach with
consistency. Educating your children about personal finance is a process that can take time. But if you put in the effort and continuously communicate a clear message
about money, you will instill good habits that will serve your children well.
The main purpose of the text is to
a) demonstrate the ineffectiveness of teaching small children how to deal with money.
b) show parents the importance of teaching children how to use money and ways to do it.
c) prove the point that giving children money will have a negative effect in their adult life.
d) list the biggest difficulties and challenges of teaching personal finances to children.
e) affirm that money habits can’t be taught to children as effectively before their teens.
www.tecconcursos.com.br/questoes/2400841
By Rich Beattie
What we call “money” has always been a moving target. It changes appearance and value. Here are five key developments in the history of money that have impacted how
we earn, save and spend today.
Cash Cows - Before humans had money, they had stuff. In ancient times, when you had stuff other people wanted, you bartered it for stuff you wanted.
Around 9000 BC, the most popular commodities included things like cattle, sheep and camels. This was fine when people bartered close to home, but bulky creatures are
cumbersome and difficult to transport. As people started to venture farther afield to trade, a more portable option became essential.
In 1200 BC people started using cowries—the shells of marine mollusks taken from oceans. They were recognized as precious, and their use spread across Asia, Africa,
Oceania and Europe. Having been in use for centuries — even into the 20th century in some places—cowries win the prize as the world’s longest-running currency.
Three Coins in the Fountain - The issue with bartering became assigning value: Just how much was a cowrie or a cow worth? So, agreeing on the value of money
became essential. It was the Lydians, around 600 BC, who get credit for a critical step in this process: fashioning the first known coins, which were made of a gold and
silver alloy.
The metal used to make a coin—along with its weight—was important, as it denoted the money’s value. Moreover, as coins gained popularity, so did the idea of adorning
them with locally inspired designs.
Coins were money, but they now doubled as a historic record. Eventually, they took on even more uses: People flipped them to make decisions and tossed them into wells
while making wishes. They may be used less in 2020, but coins have been an integral part of our culture for centuries.
The Paper Chase - Coins were obviously lighter and easier to transport than cows, but carrying bags of heavy metal still wasn’t very practical. China’s Tang Dynasty, in
the seventh century, came up with a smart solution, namely, paper money. It was super-light and could feature even richer designs than coins, and it promised a certain
amount of purchasing power.
Gold Rush - One of the problems, though, was that counterfeiters had great success with paper bills.
The bigger problem came when governments faced economic crises; it was far too easy to print more paper money, which led to skyrocketing inflation.
Paper needed a backup—something universally valued yet not easily replicated. Something like gold.
The “gold standard” let governments create a fixed price for this precious metal that was tied directly to the value of their currency. In the United States, the idea took root
in the late 17th century, and it spread to Europe in the 19th century. But confidence in the gold standard crumbled during World War I, and it soon became apparent that in
order to thrive, currencies needed the freedom to fluctuate dynamically against each other. The gold standard was dropped in the United States in 1933, and a global
economy started to take shape.
The E-Buck Stops Here? - Cows, cowries, coins, paper, gold: Money has always had a physical presence. But today, it is quickly evolving into numbers that float through
the ether. This modern era of money began in 1946 with the first bank-issued charge card. Credit cards followed some 12 years later, still related to dollars. However,
technology, with cryptocurrencies like Bitcoin, is changing the world’s definition of “money.”
Now, social media companies and entire countries are considering digital currencies of their own. Meanwhile, artificial intelligence is growing ever smarter, and perhaps
one day soon your budget and expenses will be managing themselves. The debate rages about exactly where we are headed, but with history as our guide, the one thing
we can absolutely count on is the inevitability of change.
www.tecconcursos.com.br/questoes/2400844
By Rich Beattie
What we call “money” has always been a moving target. It changes appearance and value. Here are five key developments in the history of money that have impacted how
we earn, save and spend today.
Cash Cows - Before humans had money, they had stuff. In ancient times, when you had stuff other people wanted, you bartered it for stuff you wanted.
Around 9000 BC, the most popular commodities included things like cattle, sheep and camels. This was fine when people bartered close to home, but bulky creatures are
cumbersome and difficult to transport. As people started to venture farther afield to trade, a more portable option became essential.
In 1200 BC people started using cowries—the shells of marine mollusks taken from oceans. They were recognized as precious, and their use spread across Asia, Africa,
Oceania and Europe. Having been in use for centuries — even into the 20th century in some places—cowries win the prize as the world’s longest-running currency.
Three Coins in the Fountain - The issue with bartering became assigning value: Just how much was a cowrie or a cow worth? So, agreeing on the value of money
became essential. It was the Lydians, around 600 BC, who get credit for a critical step in this process: fashioning the first known coins, which were made of a gold and
silver alloy.
The metal used to make a coin—along with its weight—was important, as it denoted the money’s value. Moreover, as coins gained popularity, so did the idea of adorning
them with locally inspired designs.
Coins were money, but they now doubled as a historic record. Eventually, they took on even more uses: People flipped them to make decisions and tossed them into wells
while making wishes. They may be used less in 2020, but coins have been an integral part of our culture for centuries.
The Paper Chase - Coins were obviously lighter and easier to transport than cows, but carrying bags of heavy metal still wasn’t very practical. China’s Tang Dynasty, in
the seventh century, came up with a smart solution, namely, paper money. It was super-light and could feature even richer designs than coins, and it promised a certain
amount of purchasing power.
Gold Rush - One of the problems, though, was that counterfeiters had great success with paper bills.
The bigger problem came when governments faced economic crises; it was far too easy to print more paper money, which led to skyrocketing inflation.
Paper needed a backup—something universally valued yet not easily replicated. Something like gold.
The “gold standard” let governments create a fixed price for this precious metal that was tied directly to the value of their currency. In the United States, the idea took root
in the late 17th century, and it spread to Europe in the 19th century. But confidence in the gold standard crumbled during World War I, and it soon became apparent that in
order to thrive, currencies needed the freedom to fluctuate dynamically against each other. The gold standard was dropped in the United States in 1933, and a global
economy started to take shape.
The E-Buck Stops Here? - Cows, cowries, coins, paper, gold: Money has always had a physical presence. But today, it is quickly evolving into numbers that float through
the ether. This modern era of money began in 1946 with the first bank-issued charge card. Credit cards followed some 12 years later, still related to dollars. However,
technology, with cryptocurrencies like Bitcoin, is changing the world’s definition of “money.”
Now, social media companies and entire countries are considering digital currencies of their own. Meanwhile, artificial intelligence is growing ever smarter, and perhaps
one day soon your budget and expenses will be managing themselves. The debate rages about exactly where we are headed, but with history as our guide, the one thing
we can absolutely count on is the inevitability of change.
In the paragraph of the text, the author mentions that paper money first circulated in
a) Italy
b) China
c) England
d) Australia
e) Germany
www.tecconcursos.com.br/questoes/2400853
By Rich Beattie
What we call “money” has always been a moving target. It changes appearance and value. Here are five key developments in the history of money that have impacted how
we earn, save and spend today.
Cash Cows - Before humans had money, they had stuff. In ancient times, when you had stuff other people wanted, you bartered it for stuff you wanted.
Around 9000 BC, the most popular commodities included things like cattle, sheep and camels. This was fine when people bartered close to home, but bulky creatures are
cumbersome and difficult to transport. As people started to venture farther afield to trade, a more portable option became essential.
Three Coins in the Fountain - The issue with bartering became assigning value: Just how much was a cowrie or a cow worth? So, agreeing on the value of money
became essential. It was the Lydians, around 600 BC, who get credit for a critical step in this process: fashioning the first known coins, which were made of a gold and
silver alloy.
The metal used to make a coin—along with its weight—was important, as it denoted the money’s value. Moreover, as coins gained popularity, so did the idea of adorning
them with locally inspired designs.
Coins were money, but they now doubled as a historic record. Eventually, they took on even more uses: People flipped them to make decisions and tossed them into wells
while making wishes. They may be used less in 2020, but coins have been an integral part of our culture for centuries.
The Paper Chase - Coins were obviously lighter and easier to transport than cows, but carrying bags of heavy metal still wasn’t very practical. China’s Tang Dynasty, in
the seventh century, came up with a smart solution, namely, paper money. It was super-light and could feature even richer designs than coins, and it promised a certain
amount of purchasing power.
Gold Rush - One of the problems, though, was that counterfeiters had great success with paper bills.
The bigger problem came when governments faced economic crises; it was far too easy to print more paper money, which led to skyrocketing inflation.
Paper needed a backup—something universally valued yet not easily replicated. Something like gold.
The “gold standard” let governments create a fixed price for this precious metal that was tied directly to the value of their currency. In the United States, the idea took root
in the late 17th century, and it spread to Europe in the 19th century. But confidence in the gold standard crumbled during World War I, and it soon became apparent that in
order to thrive, currencies needed the freedom to fluctuate dynamically against each other. The gold standard was dropped in the United States in 1933, and a global
economy started to take shape.
The E-Buck Stops Here? - Cows, cowries, coins, paper, gold: Money has always had a physical presence. But today, it is quickly evolving into numbers that float through
the ether. This modern era of money began in 1946 with the first bank-issued charge card. Credit cards followed some 12 years later, still related to dollars. However,
technology, with cryptocurrencies like Bitcoin, is changing the world’s definition of “money.”
Now, social media companies and entire countries are considering digital currencies of their own. Meanwhile, artificial intelligence is growing ever smarter, and perhaps
one day soon your budget and expenses will be managing themselves. The debate rages about exactly where we are headed, but with history as our guide, the one thing
we can absolutely count on is the inevitability of change.
In the paragraph, the author of the text predicts that there are chances artificial intelligence will be able to
www.tecconcursos.com.br/questoes/2689165
The shipping trends play a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for
decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the
industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual
labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to
overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location; this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry
becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with
everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping news is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize the way that we ship goods around the world.
www.tecconcursos.com.br/questoes/2689166
The shipping trends play a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for
decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the
industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual
labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to
overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location; this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry
becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with
everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping news is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize
the way that we ship goods around the world.
From the second paragraph of the text, one can conclude that green technology can be achieved with
a) HFO
b) LNG
c) diesel
d) greenhouse gas
e) CO2 emission increase
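As a rough check on the 900-million-ton figure cited in the Electric Ships paragraph: assuming total global CO2 emissions of roughly 35 to 40 billion tonnes per year (an outside estimate, not given in the passage), 0.9 billion / 37 billion ≈ 0.024, i.e. about 2.4%, which is consistent with the 2-3% share the text attributes to maritime transport.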
www.tecconcursos.com.br/questoes/2689167
The shipping trends play a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location; this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry
becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with
everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping news is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize
the way that we ship goods around the world.
In the fragment in the fourth paragraph of the text “While there are many regulatory hurdles to overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans”, the word “they” refers to
a) common sight
b) world’s oceans
c) regulatory hurdles
d) many regulations
e) autonomous vessels
www.tecconcursos.com.br/questoes/2689168
The shipping trends play a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for
decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the
industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual
labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to
overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location; this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry
becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with
everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping news is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize
the way that we ship goods around the world.
In the fifth paragraph of the text, the author states that “blockchain” is a technology that can
The shipping trends play a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for
decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the
industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual
labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to
overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location; this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry
becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with
everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping news is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize
the way that we ship goods around the world.
In the fragment in the fifth paragraph “Moreover, blockchain-based smart contracts could automate”, the word “Moreover” can be associated with the idea of:
a) time
b) addition
c) condition
d) emphasis
e) opposition
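The Blockchain paragraph mentions that smart contracts could automate administrative tasks such as documentation and billing. Purely as a conceptual sketch of that idea, not a real blockchain API or any company's actual workflow, the rule "release payment once the required documents are recorded" could be expressed in Python as follows (class, field and document names are hypothetical):

# Illustrative sketch only: a toy "smart contract"-style rule for automatic
# billing once shipping documents are confirmed. Not a real blockchain API.
from dataclasses import dataclass, field

@dataclass
class ShippingContract:
    shipper: str
    carrier: str
    amount_usd: float
    documents: set = field(default_factory=set)
    required: frozenset = frozenset({"bill_of_lading", "customs_clearance"})
    paid: bool = False

    def submit_document(self, name: str) -> None:
        # Record a document and settle automatically once everything required is in.
        self.documents.add(name)
        if not self.paid and self.required <= self.documents:
            self.paid = True
            print(f"Released ${self.amount_usd:,.2f} from {self.shipper} to {self.carrier}")

contract = ShippingContract("Acme Exports", "BlueWave Lines", 125000.0)
contract.submit_document("bill_of_lading")
contract.submit_document("customs_clearance")  # all documents present: payment is released

On an actual blockchain platform the same rule would run as contract code on the ledger itself, which is what removes the manual paperwork step the passage refers to.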
www.tecconcursos.com.br/questoes/2689178
The shipping trends play a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for
decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the
industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual
labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to
overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location – this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping industry is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize
the way that we ship goods around the world.
In the eighth paragraph of the text, the author states that new technology will
a) contribute to world wars.
b) privilege a few companies.
c) keep operations as they are.
d) aggravate the oceans’ pollution.
e) revolutionize shipping operations.
www.tecconcursos.com.br/questoes/2689180
The shipping industry plays a vital role in global trade, transporting goods worth trillions of dollars yearly. Population growth and continued urbanization will also lead to an
increase in demand for maritime shipping services. The maritime shipping industry must continue to innovate and adopt new technologies to meet this increased demand.
The following are some of the most promising trends and innovations currently taking place in the maritime shipping industry:
Green Technology - One of the most critical trends in maritime shipping is the move toward green technology. With increasing public awareness of the need to protect the
environment, it is becoming increasingly crucial for maritime companies to adopt green practices. Maritime companies invest in cleaner-burning fuels such as LNG
(liquefied natural gas). LNG produces significantly lower emissions than traditional marine fuels such as heavy fuel oil (HFO) and diesel. Some maritime companies are also
experimenting with battery-powered ships to reduce emissions further. While battery-powered ships are not yet commercially viable on long voyages, they show great
promise for use on shorter routes.
Electric Ships - Global maritime transport emits around 900 million tons of carbon dioxide annually, accounting for 2-3% of the world’s total emissions. As the push for
decarbonization gathers momentum, it is only a matter of time before electric ships become the norm.
Autonomous Ships - Another exciting trend in maritime shipping is the development of autonomous ships. Autonomous ships have the potential to revolutionize the
industry. They offer many advantages over traditional vessels, including reduced operating costs, increased efficiency, and improved safety by reducing the need for manual
labor onboard ships. In addition, automated systems are less susceptible to human error than their manual counterparts. While there are many regulatory hurdles to
overcome before autonomous vessels can be deployed commercially, they are expected to eventually become a common sight in the world’s oceans.
Blockchain - Blockchain technology is also beginning to make its way into the maritime shipping industry. Blockchain offers several potential benefits for maritime
companies, including improved tracking of shipments and real-time visibility of their location – this would minimize delays caused by lost or misplaced cargo, reduce
paperwork, and increase transparency throughout the supply chain. Moreover, blockchain-based smart contracts could automate many administrative tasks related to
shipping, such as documentation and billing.
Big data and predictive analytics - Another major trend transforming maritime shipping is the increasing use of big data and predictive analytics. The shipping industry
generates vast amounts of data that can be extremely valuable if analyzed correctly. Big data analytics can improve everything from route planning to fuel consumption. By
harnessing the power of data, shipping companies can optimize their operations, reduce costs, and enhance safety and security. Predictive analytics is particularly valuable
for identifying potential problems before they occur, such as equipment failures or weather hazards.
Cybersecurity - Cybersecurity is a growing concern for maritime companies due to the increased reliance on digital systems and networks. As the shipping industry
becomes increasingly digitized, companies must implement robust cybersecurity measures to protect their vessels and cargo from attack. Ships are now equipped with
everything from satellite communications to remote monitoring capabilities, all of which create potential cyber vulnerabilities.
Conclusion - The maritime shipping industry is undergoing a period of significant change, with new technologies and trends emerging that have the potential to revolutionize
the way that we ship goods around the world.
The vessel that is NOT adequate to the mentioned cargo transport is:
www.tecconcursos.com.br/questoes/2689818
We live in vulnerable energy times. The energy crisis, climate change and energy transition are all shaking and shaping the global future. “The energy realities of the world
remind us that oil and gas will be here for decades to pivot a just, affordable and secure energy transition,” as John Hess, CEO of Hess Corporation, mentioned during the
International Energy Conference and Expo in Guyana in February 2023.
As someone said, vulnerability is the birthplace of innovation and technology is the driving force behind progressive changes. Nevertheless, how can Guyana play a vital
role in reordering energy security? “By embedding innovation earlier in the process, Guyana can skip several steps and avoid what most economies went through” – this idea
was emphasized several times during the same conference. “If we integrate innovation into Guyana’s process today, there might be some accelerated success.”
Guyana can play an essential role in balancing the global energy supply and demand markets and address the energy crisis by becoming a top crude oil producer globally.
The goal is to become competitive in the global oil and gas market and this can be achieved by attracting and establishing partnerships with companies that can bring
increased efficiency and productivity to the local oil and gas operations, from exploration and production to storage and transportation. For Guyana, this means that
improvements in regulations, a transparent, secure and competitive environment for foreign investment, and incentives from the government can serve as catalysts for
technology and innovation.
Collaborating with universities and creating a business innovation hub mentality for young entrepreneurs with government support, like loan guarantees, grants, and tax
credits, will also spur the industry.
Innovative technology will play a critical role in climate change. The oil and gas sector must reduce its emissions by at least 3.4 gigatons of CO2 equivalent a year by 2050
– a 90% reduction in current emissions. Guyana today can become a world leader in setting a benchmark around flaring and it’s possible for the country to achieve a zero-flare
objective, because “from day one the right solutions and the right technologies were properly planned and properly positioned in order to enable the extraction and
the production with almost zero carbon footprint”, as the Emissions Director at Schlumberger vocalized about a year ago.
Innovations and technologies are key to the energy transition, from floating wind farms to solar photovoltaic farm developments, waste-to-fuel projects and green
hydrogen, shaping Guyana’s energy transition and future. All this requires not only massive financial support but also an innovation-oriented and technology-friendly
environment, with a strong emphasis on education, training and research. Nevertheless, the decision in Guyana on what technologies to adopt and how much to innovate
will have a big impact on results over the long term and the government should base it on a clear vision and roadmap.
The fragment of paragraph 2 “vulnerability is the birthplace of innovation” means that vulnerability
www.tecconcursos.com.br/questoes/2689916
We live in vulnerable energy times. The energy crisis, climate change and energy transition are all shaking and shaping the global future. “The energy realities of the world
remind us that oil and gas will be here for decades to pivot a just, affordable and secure energy transition,” as John Hess, CEO of Hess Corporation, mentioned during the
International Energy Conference and Expo in Guyana in February 2023.
As someone said, vulnerability is the birthplace of innovation and technology is the driving force behind progressive changes. Nevertheless, how can Guyana play a vital
role in reordering energy security? “By embedding innovation earlier in the process, Guyana can skip several steps and avoid what most economies went through” – this idea
was emphasized several times during the same conference. “If we integrate innovation into Guyana’s process today, there might be some accelerated success.”
Guyana can play an essential role in balancing the global energy supply and demand markets and address the energy crisis by becoming a top crude oil producer globally.
The goal is to become competitive in the global oil and gas market and this can be achieved by attracting and establishing partnerships with companies that can bring
increased efficiency and productivity to the local oil and gas operations, from exploration and production to storage and transportation. For Guyana, this means that
improvements in regulations, a transparent, secure and competitive environment for foreign investment, and incentives from the government can serve as catalysts for
technology and innovation.
Collaborating with universities and creating a business innovation hub mentality for young entrepreneurs with government support, like loan guarantees, grants, and tax
credits, will also spur the industry.
Innovative technology will play a critical role in climate change. The oil and gas sector must reduce its emissions by at least 3.4 gigatons of CO2 equivalent a year by 2050
– a 90% reduction in current emissions. Guyana today can become a world leader in setting a benchmark around flaring and it’s possible for the country to achieve a zero-flare
objective, because “from day one the right solutions and the right technologies were properly planned and properly positioned in order to enable the extraction and
the production with almost zero carbon footprint”, as the Emissions Director at Schlumberger vocalized about a year ago.
Innovations and technologies are key to the energy transition, from floating wind farms to solar photovoltaic farm developments, waste-to-fuel projects and green
hydrogen, shaping Guyana’s energy transition and future. All this requires not only massive financial support but also an innovation-oriented and technology-friendly
environment, with a strong emphasis on education, training and research. Nevertheless, the decision in Guyana on what technologies to adopt and how much to innovate
will have a big impact on results over the long term and the government should base it on a clear vision and roadmap.
The segment of paragraph 2 “technology is the driving force behind progressive changes” means that technology
a) produces progressive changes.
b) reduces progressive changes.
c) prevents progressive changes.
d) slows down progressive changes.
e) complicates progressive changes.
www.tecconcursos.com.br/questoes/2689939
We live in vulnerable energy times. The energy crisis, climate change and energy transition are all shaking and shaping the global future. “The energy realities of the world remind us that oil and gas will be here for decades to pivot a just, affordable and secure energy transition,” as John Hess, CEO of Hess Corporation, mentioned during the International Energy Conference and Expo in Guyana in February 2023.
As someone said, vulnerability is the birthplace of innovation and technology is the driving force behind progressive changes. Nevertheless, how can Guyana play a vital role in reordering energy security? “By embedding innovation earlier in the process, Guyana can skip several steps and avoid what most economies went through” – this idea was emphasized several times during the same conference. “If we integrate innovation into Guyana’s process today, there might be some accelerated success.”
Guyana can play an essential role in balancing the global energy supply and demand markets and address the energy crisis by becoming a top crude oil producer globally.
The goal is to become competitive in the global oil and gas market and this can be achieved by attracting and establishing partnerships with companies that can bring
increased efficiency and productivity to the local oil and gas operations, from exploration and production to storage and transportation. For Guyana, this means that
improvements in regulations, a transparent, secure and competitive environment for foreign investment, and incentives from the government can serve as catalysts for
technology and innovation.
Collaborating with universities and creating a business innovation hub mentality for young entrepreneurs with government support, like loan guarantees, grants, and tax
credits, will also spur the industry.
Innovative technology will play a critical role in climate change. The oil and gas sector must reduce its emissions by at least 3.4 gigatons of CO2 equivalent a year by 2050
– a 90% reduction in current emissions. Guyana today can become a world leader in setting a benchmark around flaring and it’s possible for the country to achieve a zero-flare
objective, because “from day one the right solutions and the right technologies were properly planned and properly positioned in order to enable the extraction and
the production with almost zero carbon footprint”, as the Emissions Director at Schlumberger vocalized about a year ago.
Innovations and technologies are key to the energy transition, from floating wind farms to solar photovoltaic farm developments, waste-to-fuel projects and green
hydrogen, shaping Guyana’s energy transition and future. All this requires not only massive financial support but also an innovation-oriented and technology-friendly
environment, with a strong emphasis on education, training and research. Nevertheless, the decision in Guyana on what technologies to adopt and how much to innovate
will have a big impact on results over the long term and the government should base it on a clear vision and roadmap.
In the section of paragraph 6 “the government should base it on a clear vision and roadmap”, the modal verb should indicates
a) a desirable action
b) a thoughtless action
c) an unpredictable action
d) an arbitrary action
e) a dynamic action
www.tecconcursos.com.br/questoes/2693658
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent
flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a
sustainability level.
www.tecconcursos.com.br/questoes/2693671
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent
flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a
sustainability level.
From the fragment in the second paragraph of the text “connectivity that can reach into situations where terrestrial technologies struggle to deliver”, it can be concluded
that terrestrial technologies can present data problems related to their
a) price
b) safety
c) choice
d) marketing
e) transmission
www.tecconcursos.com.br/questoes/2693673
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a
sustainability level.
From the fragment in the second paragraph of the text “Right now, the satellite supplier market is booming, driving down the cost of access to satellites”, one can infer that
the more access to the satellite supplier market is feasible,
a) the lower its price will be.
b) the higher its price will be.
c) the better its quality will be.
d) the poorer its quality will be.
e) the more reliable its quality will be.
www.tecconcursos.com.br/questoes/2693680
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent
flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a
sustainability level.
The fragment in the third paragraph of the text “The Satellites for Digitalization of Railways (SODOR) project will provide low latency” means that
a) low volume of data will be conveyed within hours.
b) low volume of data will be interrupted for a few minutes.
c) low volume of data will be communicated within minutes.
d) high volume of data will be transmitted with minimal delay.
e) high volume of data will be transferred after a few minutes.
www.tecconcursos.com.br/questoes/2693688
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent
flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a
sustainability level.
From the fifth paragraph of the text, one can infer that models for wind and solar production can provide sources of
a) unreliable power
b) intermittent energy
c) constant power flow
d) scarce energy sources
e) dangerous power sources
www.tecconcursos.com.br/questoes/2693693
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent
flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a sustainability level.
From the seventh paragraph of the text, one can infer that automated driving will have the benefits of
a) human drivers
b) space technology
c) terrestrial connectivity
d) traffic controlled by people
e) 20th century designed cars
www.tecconcursos.com.br/questoes/2693695
Space technology is developing fast, and, with every advance, it is becoming more accessible to industry. Today, satellite communications (satcoms) and space-based data
are underpinning new ways of operating that boost both sustainability and profitability. Some projects are still in the planning stages, offering great promise for the future.
However, others are already delivering practical results.
The benefits of space technology broadly fall into two categories: connectivity that can reach into situations where terrestrial technologies struggle to deliver and the deep,
unique insights delivered by Earth Observation (EO) data. Both depend on access to satellite networks, particularly medium earth orbit (MEO) and low earth orbit (LEO)
satellites that offer low-latency connectivity and frequently updated data. Right now, the satellite supplier market is booming, driving down the cost of access to satellites.
Suppliers are increasingly tailoring their services to emerging customer needs and the potential applications are incredible – as a look at the transportation sector shows.
Satellite technology is a critical part of revolutionizing connectivity on trains. The Satellites for Digitalization of Railways (SODOR) project will provide low latency, highly
reliable connectivity that, combined with monitoring sensors, will mean near real-time data guides operational decisions. This insight will help trains run more efficiently
with fewer delays for passengers. Launching this year, SODOR will help operators reduce emissions by using the network more efficiently, allowing preventative
maintenance and extending the lifetime of some existing trains. It will also make rail travel more attractive and help shift more passengers from road to rail (that typically
emits even less CO2 per passenger than electric cars do).
Satellite data and communications will also play a fundamental role in shaping a sustainable future for road vehicles. Right now, the transport sector contributes around
14% of the UK’s greenhouse gas emissions, of which 91% is from road vehicles – and this needs to change.
A future where Electric Vehicles (EV) dominate will need a smart infrastructure to monitor and control the electricity network, managing highly variable supply and
demand, as well as a large network of EV charging points. EO data will be critical in future forecasting models for wind and solar production, to help manage a consistent
flow of green energy.
Satellite communications will also be pivotal. As more wind and solar installations join the electricity network – often in remote locations – satcoms will step in to deliver
highly reliable connectivity where 4G struggles to reach. It will underpin a growing network of EV charging points, connecting each point to the internet for operational
management purposes, for billing and access app functionality, and for the users’ comfort, since they may access the system wherever they are.
Satellite technology will increasingly be a part of the vehicles themselves, particularly when automated driving becomes more mainstream. It will be essential for every
vehicle to have continuous connectivity to support real-time software patches, map updates and inter-vehicle communications. Already, satellites provide regular software
updates to vehicles and enhanced safety through an in-car emergency call service.
At our company, we have been deeply embedded in space engineering for more than 40 years – and we continue to be involved with the state-of-the-art technologies
and use cases. We have a strong track record of translating these advances into practical benefits for our customers that make sense on both a business and a
sustainability level.
In the eighth paragraph of the text, the author states that, for the last 40 years, the company where he works has been
a) embedded in antipollution laws.
b) dedicated to space travel medicine.
c) involved with cutting-edge space industry.
d) concerned with the Earth’s polar ice caps.
e) engaged in antinuclear weapon campaigns.
www.tecconcursos.com.br/questoes/2037063
U.S. domestic air conditioning use could exceed electric capacity in next decade due to climate change
Climate change will provoke an increase in summer air conditioning use in the United States that will probably cause prolonged blackouts during peak summer heat if
states do not expand capacity or improve efficiency, according to a new study of domestic-level demand.
Human emissions have put the global climate on a trajectory to exceed 1.5 degrees Celsius of warming by the early 2030s, the IPCC reported in its 2021 evaluation.
Without significant alleviation, global temperatures will probably exceed the 2.0-degree Celsius limit by the end of the century.
Previous research has examined the impacts of higher future temperatures on annual electricity consumption for specific cities or states. The new study is the first to
project residential air conditioning demand on a domestic basis at a wide scale. It incorporates observed and predicted air temperature and heat, humidity and discomfort
indices with air conditioning use by statistically representative domiciles across the contiguous United States, collected by the U.S. Energy Information Administration (EIA)
in 2005-2019.
“It’s a pretty clear warning to all of us that we can’t keep doing what we are doing or our energy system will fail completely in the next few decades, simply because of the
summertime air conditioning,” said Susanne Benz, a geographer and climate scientist at Dalhousie University in Halifax, Nova Scotia.
The heaviest air conditioning use with the greatest risk for overcharging the transmission lines comes during heat waves, which also present the highest risk to health.
Electricity generation tends to be below peak during heat waves as well, reducing capacity to even lower levels, said Renee Obringer, an environmental engineer at Penn
State University. Without enough capacity to satisfy demand, energy companies may have to adopt systematic blackouts during heat waves to avoid network failure, like
California’s energy organizations did in August 2020 during an extended period of record heat sometimes topping 117 degrees Fahrenheit. “We’ve seen this in California
already -- state power companies had to institute blackouts because they couldn’t provide the needed electricity,” Obringer said. The state attributed 599 deaths to the
heat, but the true number may have been closer to 3,900.
The new study predicted the largest increases in kilowatt-hours of electricity demand in the already hot south and southwest. If all Arizona houses were to increase air
conditioning use by the estimated 6% needed at 1.5 degrees Celsius of global warming, for example, amounting to 30 kilowatt-hours per month, this would place an
additional 54.5 million kilowatt-hours of demand on the electrical network monthly.
www.tecconcursos.com.br/questoes/2037067
U.S. domestic air conditioning use could exceed electric capacity in next decade due to climate change
Climate change will provoke an increase in summer air conditioning use in the United States that will probably cause prolonged blackouts during peak summer heat if
states do not expand capacity or improve efficiency, according to a new study of domestic-level demand.
Human emissions have put the global climate on a trajectory to exceed 1.5 degrees Celsius of warming by the early 2030s, the IPCC reported in its 2021 evaluation.
Without significant alleviation, global temperatures will probably exceed the 2.0-degree Celsius limit by the end of the century.
Previous research has examined the impacts of higher future temperatures on annual electricity consumption for specific cities or states. The new study is the first to
project residential air conditioning demand on a domestic basis at a wide scale. It incorporates observed and predicted air temperature and heat, humidity and discomfort
indices with air conditioning use by statistically representative domiciles across the contiguous United States, collected by the U.S. Energy Information Administration (EIA)
in 2005-2019.
“It’s a pretty clear warning to all of us that we can’t keep doing what we are doing or our energy system will fail completely in the next few decades, simply because of the
summertime air conditioning,” said Susanne Benz, a geographer and climate scientist at Dalhousie University in Halifax, Nova Scotia.
The heaviest air conditioning use with the greatest risk for overcharging the transmission lines comes during heat waves, which also present the highest risk to health.
Electricity generation tends to be below peak during heat waves as well, reducing capacity to even lower levels, said Renee Obringer, an environmental engineer at Penn
State University. Without enough capacity to satisfy demand, energy companies may have to adopt systematic blackouts during heat waves to avoid network failure, like
California’s energy organizations did in August 2020 during an extended period of record heat sometimes topping 117 degrees Fahrenheit. “We’ve seen this in California
already -- state power companies had to institute blackouts because they couldn’t provide the needed electricity,” Obringer said. The state attributed 599 deaths to the
heat, but the true number may have been closer to 3,900.
The new study predicted the largest increases in kilowatt-hours of electricity demand in the already hot south and southwest. If all Arizona houses were to increase air
conditioning use by the estimated 6% needed at 1.5 degrees Celsius of global warming, for example, amounting to 30 kilowatt-hours per month, this would place an
additional 54.5 million kilowatt-hours of demand on the electrical network monthly.
In the 2nd paragraph, it is noticed that, according to the IPCC report in 2021, the global temperature will probably rise 1.5 degrees Celsius by the early 2030s due to
a) air conditioning use
b) human emissions
c) electricity consumption
d) electric capacity overcharge
e) blackouts
www.tecconcursos.com.br/questoes/2037070
U.S. domestic air conditioning use could exceed electric capacity in next decade due to climate change
Climate change will provoke an increase in summer air conditioning use in the United States that will probably cause prolonged blackouts during peak summer heat if
states do not expand capacity or improve efficiency, according to a new study of domestic-level demand.
Human emissions have put the global climate on a trajectory to exceed 1.5 degrees Celsius of warming by the early 2030s, the IPCC reported in its 2021 evaluation.
Without significant alleviation, global temperatures will probably exceed the 2.0-degree Celsius limit by the end of the century.
Previous research has examined the impacts of higher future temperatures on annual electricity consumption for specific cities or states. The new study is the first to project residential air conditioning demand on a domestic basis at a wide scale. It incorporates observed and predicted air temperature and heat, humidity and discomfort indices with air conditioning use by statistically representative domiciles across the contiguous United States, collected by the U.S. Energy Information Administration (EIA) in 2005-2019.
“It’s a pretty clear warning to all of us that we can’t keep doing what we are doing or our energy system will fail completely in the next few decades, simply because of the
summertime air conditioning,” said Susanne Benz, a geographer and climate scientist at Dalhousie University in Halifax, Nova Scotia.
The heaviest air conditioning use with the greatest risk for overcharging the transmission lines comes during heat waves, which also present the highest risk to health.
Electricity generation tends to be below peak during heat waves as well, reducing capacity to even lower levels, said Renee Obringer, an environmental engineer at Penn
State University. Without enough capacity to satisfy demand, energy companies may have to adopt systematic blackouts during heat waves to avoid network failure, like
California’s energy organizations did in August 2020 during an extended period of record heat sometimes topping 117 degrees Fahrenheit. “We’ve seen this in California
already -- state power companies had to institute blackouts because they couldn’t provide the needed electricity,” Obringer said. The state attributed 599 deaths to the
heat, but the true number may have been closer to 3,900.
The new study predicted the largest increases in kilowatt-hours of electricity demand in the already hot south and southwest. If all Arizona houses were to increase air
conditioning use by the estimated 6% needed at 1.5 degrees Celsius of global warming, for example, amounting to 30 kilowatt-hours per month, this would place an
additional 54.5 million kilowatt-hours of demand on the electrical network monthly.
The fragment in paragraph 5 “Electricity generation tends to be below peak” means that
a) there is usually no electricity left by that time of year.
b) electricity generation is not at its maximum capacity.
c) the quality of electricity generation is not acceptable.
d) excess electricity is being generated.
e) the electricity companies easily satisfy the increased demand.
www.tecconcursos.com.br/questoes/2037073
U.S. domestic air conditioning use could exceed electric capacity in next decade due to climate change
Climate change will provoke an increase in summer air conditioning use in the United States that will probably cause prolonged blackouts during peak summer heat if
states do not expand capacity or improve efficiency, according to a new study of domestic-level demand.
Human emissions have put the global climate on a trajectory to exceed 1.5 degrees Celsius of warming by the early 2030s, the IPCC reported in its 2021 evaluation.
Without significant alleviation, global temperatures will probably exceed the 2.0-degree Celsius limit by the end of the century.
Previous research has examined the impacts of higher future temperatures on annual electricity consumption for specific cities or states. The new study is the first to
project residential air conditioning demand on a domestic basis at a wide scale. It incorporates observed and predicted air temperature and heat, humidity and discomfort
indices with air conditioning use by statistically representative domiciles across the contiguous United States, collected by the U.S. Energy Information Administration (EIA)
in 2005-2019.
“It’s a pretty clear warning to all of us that we can’t keep doing what we are doing or our energy system will fail completely in the next few decades, simply because of the
summertime air conditioning,” said Susanne Benz, a geographer and climate scientist at Dalhousie University in Halifax, Nova Scotia.
The heaviest air conditioning use with the greatest risk for overcharging the transmission lines comes during heat waves, which also present the highest risk to health.
Electricity generation tends to be below peak during heat waves as well, reducing capacity to even lower levels, said Renee Obringer, an environmental engineer at Penn
State University. Without enough capacity to satisfy demand, energy companies may have to adopt systematic blackouts during heat waves to avoid network failure, like
California’s energy organizations did in August 2020 during an extended period of record heat sometimes topping 117 degrees Fahrenheit. “We’ve seen this in California
already -- state power companies had to institute blackouts because they couldn’t provide the needed electricity,” Obringer said. The state attributed 599 deaths to the
heat, but the true number may have been closer to 3,900.
The new study predicted the largest increases in kilowatt-hours of electricity demand in the already hot south and southwest. If all Arizona houses were to increase air
conditioning use by the estimated 6% needed at 1.5 degrees Celsius of global warming, for example, amounting to 30 kilowatt-hours per month, this would place an
additional 54.5 million kilowatt-hours of demand on the electrical network monthly.
The fragment in paragraph 5 “an extended period of record heat sometimes topping 117 degrees Fahrenheit” describes a climate condition characterized by
a) low and mild temperatures
b) quickly oscillating temperatures
c) exceptionally high temperatures
d) alternating hot and dry weather
e) moderate temperatures and bad weather
www.tecconcursos.com.br/questoes/2039988
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
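To make the generation figures in the second paragraph of the passage above easier to picture, here is a rough Python sketch (not part of the original article) converting the stated shares of the roughly four trillion kilowatt-hours generated in 2020 into absolute amounts; the flat-demand assumption in the last line is mine, and the passage itself says demand will rise:

# Rough conversion of the EIA shares cited in paragraph 2 (illustrative only).
total_generation_kwh = 4e12              # ~4 trillion kWh generated in the U.S. in 2020
fossil_share = 0.60                      # ~60% from fossil fuels
renewable_share = 0.20                   # ~20% from renewables, including hydro
projected_renewable_share_2050 = 0.40    # EIA projection cited in the passage

print(f"Fossil generation to replace: {total_generation_kwh * fossil_share:.2e} kWh")    # ~2.4e12
print(f"Current renewable generation: {total_generation_kwh * renewable_share:.2e} kWh")  # ~8.0e11
# Assuming demand stayed flat (it will not, per the passage), a 40% share in 2050
# would be about 1.6e12 kWh, still well short of replacing all fossil generation.
print(f"Renewables at a 40% share: {total_generation_kwh * projected_renewable_share_2050:.2e} kWh")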
In the fragment of paragraph 2 “because demand for electricity is expected to rise, especially if we power more cars with it”, “is expected to rise” is used to
www.tecconcursos.com.br/questoes/2039989
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
www.tecconcursos.com.br/questoes/2039991
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
The fragment in paragraph 5 “Nuclear power has a lot going for it” means that the use of nuclear power
a) presents many advantageous qualities.
b) generates some doubts about its efficiency.
c) constitutes a real threat to national security.
d) raises severe concerns about potential accidents.
e) provokes negative reactions among environmentalists.
www.tecconcursos.com.br/questoes/2039995
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
In the fragment of paragraph 7 “and the perennial concerns about safety and long-lived radioactive waste may not be the biggest”, “may not be” expresses a(n)
a) possibility
b) obligation
c) necessity
d) certainty
e) ability
www.tecconcursos.com.br/questoes/2039996
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
According to Jacopo Buongiorno, one of the reasons why it is more expensive to build large nuclear plants in the West is that
www.tecconcursos.com.br/questoes/2039997
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
In paragraph 12, the author affirms “(To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays)”, in order to
a) clarify that China has also faced problems with the construction of large-scale nuclear reactors.
b) praise China’s capacity of building large-scale nuclear reactors fast and effectively.
c) explain that China is more efficient than South Korea when building large-scale nuclear reactors.
d) support the view that China and South Korea can build projects on budget and on schedule.
e) discuss the reasons why China and South Korea can build nuclear reactors at a lower cost.
www.tecconcursos.com.br/questoes/2039998
President Joe Biden has set ambitious goals for fighting climate change: To cut U.S. carbon emissions in half by 2030 and to have a net-zero carbon economy by 2050. The
plan requires electricity generation – the easiest economic sector to green, analysts say – to be carbon-free by 2035.
A few figures from the U.S. Energy Information Administration (EIA) illustrate the challenge. In 2020 the United States generated about four trillion kilowatt-hours of
electricity. Some 60 percent of that came from burning fossil fuels, mostly natural gas, in some 10,000 generators, large and small, around the country. All of that
electricity will need to be replaced - and more, because demand for electricity is expected to rise, especially if we power more cars with it.
Renewable energy sources like solar and wind have grown faster than expected; together with hydroelectric, they surpassed coal for the first time ever in 2019 and now
produce 20 percent of U.S. electricity. In February the EIA projected that renewables were on track to produce more than 40 percent by 2050 - remarkable growth,
perhaps, but still well short of what’s needed to decarbonize the grid by 2035 and forestall the climate crisis.
This daunting challenge has recently led some environmentalists to reconsider an alternative they had long been wary of: nuclear power.
Nuclear power has a lot going for it. Its carbon footprint is equivalent to wind, less than solar, and orders of magnitude less than coal. Nuclear power plants take up far
less space on the landscape than solar or wind farms, and they produce power even at night or on calm days. In 2020 they generated as much electricity in the U.S. as
renewables did, a fifth of the total.
But debates rage over whether nuclear should be a big part of the climate solution in the U.S. The majority of American nuclear plants today are approaching the end of
their design life, and only one has been built in the last 20 years. Nuclear proponents are now banking on next-generation designs, like small, modular versions of
conventional light-water reactors, or advanced reactors designed to be safer, cheaper, and more flexible.
“We’ve innovated so little in the past half-century, there’s a lot of ground to gain,” says Ashley Finan, the director of the National Reactor Innovation Center at the Idaho
National Laboratory. Yet an expansion of nuclear power faces some serious hurdles, and the perennial concerns about safety and long-lived radioactive waste may not be
the biggest: Critics also say nuclear reactors are simply too expensive and take too long to build to be of much help with the climate crisis.
While environmental opposition may have been the primary force hindering nuclear development in the 1980s and 90s, now the biggest challenge may be costs. Few
nuclear plants have been built in the U.S. recently because they are very expensive to build here, which makes the price of their energy high.
Jacopo Buongiorno, a professor of nuclear science and engineering at MIT, led a group of scientists who recently completed a two-year study examining the future of
nuclear energy in the U.S. and western Europe. They found that “without cost reductions, nuclear energy will not play a significant role” in decarbonizing the power sector.
“In the West, the nuclear industry has substantially lost its ability to build large plants,” Buongiorno says, pointing to Southern Company’s effort to add two new reactors to
Plant Vogtle in Waynesboro, Georgia. They have been under construction since 2013, are now billions of dollars over budget - the cost has more than doubled - and years
behind schedule. In France, ranked second after the U.S. in nuclear generation, a new reactor in Flamanville is a decade late and more than three times over budget.
“We have clearly lost the know-how to build traditional gigawatt-scale nuclear power plants,” Buongiorno says. Because no new plants were built in the U.S. for decades,
he and his colleagues found, the teams working on a project like Vogtle haven’t had the learning experiences needed to do the job efficiently. That leads to construction
delays that drive up costs.
Elsewhere, reactors are still being built at lower cost, “largely in places where they build projects on budget, and on schedule,” Finan explains. China and South Korea are
the leaders. (To be fair, several of China’s recent large-scale reactors have also had cost overruns and delays.)
“The cost of nuclear power in Asia has been a quarter, or less, of new builds in the West,” Finan says. Much lower labor costs are one reason, according to both Finan and
the MIT report, but better project management is another.
In the last paragraph, the author states that “Much lower labor costs are one reason, according to both Finan and the MIT report, but better project management is
another.” because he believes that
a) both Finan and the MIT report are absolutely wrong in their conclusions.
b) it is difficult to determine the reasons why nuclear power costs less in Asia.
c) nuclear power is cheaper in Asia just because of better project management.
d) neither project management nor labor costs explain the low cost of nuclear energy in Asia.
e) lower labor costs are just part of the reason why nuclear power is less expensive in Asia.
www.tecconcursos.com.br/questoes/1756025
If you think a robot will steal your job, you are not alone. Soccer players should be worried too. The next Messi probably won’t be of flesh and blood but plastic and metal.
The concept emerged during the conference “Workshop on grand challenges in artificial intelligence,” held in Tokyo in 1992, and independently, in 1993, when Professor
Alan Mackworth from the University of British Columbia in Canada described an experiment with small soccer players in a scientific article.
Over 40 teams already participated in the first RoboCup tournament in 1997, and the competition is held every year. The RoboCup Federation wants to play and win a
game against a real World Cup human team by 2050.
The idea behind artificially intelligent players is to investigate how robots perceive motion and communicate with each other. Physical abilities like walking, running, and
kicking the ball while maintaining balance are crucial to improving robots for other tasks like rescue, home, industry, and education.
Designing robots for sports requires much more than experts in state-of-the-art technology. Humans and machines do not share the same skills. Engineers need to impose
limitations on soccer robots to imitate soccer players as much as possible and ensure following the game’s rules.
RoboCup Soccer Federation, the “FIFA” of robots, which supports five leagues, imposes restrictions on players’ design and rules of the game. Each has its own robot
design and game rules to give room for different scientific goals. The number of players, their size, the ball type, and the field dimensions are different for each league.
In the humanoid league the players are humanlike robots with human-like senses. However, they are rather slow. Many of the skills needed to fully recreate actual soccer
player movements are still in the early stages of research.
The game becomes exciting for middle and small size leagues. The models are much simpler; they are just boxes with a cyclopean eye. Their design focuses on team
behavior: recognizing an opponent, cooperating with team members, receiving and giving a standard FIFA size ball.
Today, soccer robots are entirely autonomous. They wirelessly “talk” to each other, make decisions regarding strategy in real-time, replace an “injured” player, and shoot
goals. The only person in a RoboCup game is the referee. The team coaches are engineers in charge of training the RoboCups’ artificial intelligence for fair play: the
robots don’t smash against each other or pull their shirts.
The next RoboCup competition will soon be played, virtually, with rules that will allow teams to participate without establishing physical contact.
Available at:<https://www.ua-magazine.com/2021/05/12/robots-the-
-next-generation-of-soccer-players>. Retrieved on: July 4th, 2021.
Adapted.
According to the second paragraph, the concept of robotic soccer players emerged
a) in 1997
b) in the 1990s
c) before the 1990s
d) in the beginning of the 20th century
e) in the beginning of the 21st century
www.tecconcursos.com.br/questoes/1756050
If you think a robot will steal your job, you are not alone. Soccer players should be worried too. The next Messi probably won’t be of flesh and blood but plastic and metal.
The concept emerged during the conference “Workshop on grand challenges in artificial intelligence,” held in Tokyo in 1992, and independently, in 1993, when Professor
Alan Mackworth from the University of British Columbia in Canada described an experiment with small soccer players in a scientific article.
Over 40 teams already participated in the first RoboCup tournament in 1997, and the competition is held every year. The RoboCup Federation wants to play and win a
game against a real World Cup human team by 2050.
The idea behind artificially intelligent players is to investigate how robots perceive motion and communicate with each other. Physical abilities like walking, running, and
kicking the ball while maintaining balance are crucial to improving robots for other tasks like rescue, home, industry, and education.
Designing robots for sports requires much more than experts in state-of-the-art technology. Humans and machines do not share the same skills. Engineers need to impose
limitations on soccer robots to imitate soccer players as much as possible and ensure following the game’s rules.
RoboCup Soccer Federation, the “FIFA” of robots, which supports five leagues, imposes restrictions on players’ design and rules of the game. Each has its own robot
design and game rules to give room for different scientific goals. The number of players, their size, the ball type, and the field dimensions are different for each league.
In the humanoid league the players are humanlike robots with human-like senses. However, they are rather slow. Many of the skills needed to fully recreate actual soccer
player movements are still in the early stages of research.
The game becomes exciting for middle and small size leagues. The models are much simpler; they are just boxes with a cyclopean eye. Their design focuses on team
behavior: recognizing an opponent, cooperating with team members, receiving and giving a standard FIFA size ball.
Today, soccer robots are entirely autonomous. They wirelessly “talk” to each other, make decisions regarding strategy in real-time, replace an “injured” player, and shoot
goals. The only person in a RoboCup game is the referee. The team coaches are engineers in charge of training the RoboCups’ artificial intelligence for fair play: the
robots don’t smash against each other or pull their shirts.
The next RoboCup competition will soon be played, virtually, with rules that will allow teams to participate without establishing physical contact.
Available at:<https://www.ua-magazine.com/2021/05/12/robots-the-
-next-generation-of-soccer-players>. Retrieved on: July 4th, 2021.
Adapted.
In paragraph 9, there is the information that in RoboCup competitions the game referee and the team coaches are
a) humanoids
b) computers
c) real people
d) robotic engineers
e) virtual mechanisms
www.tecconcursos.com.br/questoes/1757147
Like all businesses, banks have had to act fast to respond to the unprecedented human and economic impact of Covid-19.
First, they needed to keep the lights on and ensure business continuity. Second, they had to meet the changing ways customers wanted to engage. Finally, they sought to
balance their business priorities with a responsibility to support society. Previous crises cast the banks as part of the problem — this time they are part of the solution.
Banks who have embraced modern banking technology have fared better in meeting these challenges. They’ve moved seamlessly to remote working, kept up service for
their customers, coped with huge increases in demand and quickly adapted their products. In contrast, banks using legacy ‘spaghetti’ software have struggled.
Covid-19 has accelerated the need for modern banking technology, but it didn’t create it. Before coronavirus, the 2020s were already being framed as the decade for digital
in the banking industry. Banks’ return on equity was too low and their cost-income ratios were too high. Meanwhile, regulation like open banking was disrupting the
industry and increasing competition from new entrants like the GAAFAs (Google, Amazon, Alibaba, Facebook, Apple).
Providing seamless digital customer experiences was therefore already a ‘must’. Every year, Temenos partners with the Economist Intelligence Unit (EIU) for a global study
on the future of banking. More than 300 banking leaders are interviewed from retail, commercial and private banks. Over half of these are at C-suite level.
In 2020, the study took place amid the Covid-19 crisis. The results give a fascinating insight into banking leaders’ approach during these unprecedented times. But they
also show how they see their industry in the years to come.
And the findings suggest three trends which will shape the future of banking:
1. New technologies will be the key driver of banking transformation over the next 5 years. 77% of respondents strongly believed that Artificial Intelligence (AI) will
be the most game-changing of these technologies. They see a diverse range of uses for AI — from personalised customer experience to fraud detection.
2. Banks will overhaul their business models to create digital ecosystems. 80% of respondents believe that banking will become part of a platform of services. 45%
are committed to transforming their business models into digital ecosystems.
3. The sun will set on branch banking. World Bank data shows that visits to branches have been steadily declining globally over the last decade. As a result of
coronavirus, customers are now more concerned about visiting their branch, and so even more people are willing to try digital applications. This combination of
pandemic and increasingly transformative advanced technology has led a majority of respondents (59%) to our survey with the EIU to state that traditional branch-
based banking model will be dead in just five years. That’s a 34% increase from last year.
The current environment is undoubtedly challenging for banks. But they have the capital, customer relationships and customer data. They are regulated. And most
importantly: they still enjoy their customers’ trust.
In short, banks are best-placed to succeed if they commit to end-to-end digital transformation. That means a fully digital front office which creates hyper-personalized
experiences and ecosystems. And a back office driving efficient operations and rapid innovation. By embracing modern banking technology, banks can support their
customers today, create new value for the future and drive new levels of future growth.
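The survey figures in trend 3 of the passage above invite a small arithmetic check. The Python sketch below (not part of the original text) reads the stated 34% as a relative increase over last year’s share; the passage does not give last year’s figure directly, so the values derived here are illustrative:

# Check of the survey figures in trend 3 (illustrative; last year's share is not stated).
current_share = 0.59        # respondents who think branch banking will be dead in five years
stated_increase = 0.34      # "a 34% increase from last year"

# Reading the 34% as a relative change implies last year's share was about 44%.
previous_share_relative = current_share / (1 + stated_increase)
print(f"Implied share last year (relative reading): {previous_share_relative:.1%}")

# If the 34% were instead a percentage-point change, last year's share would be about 25%.
print(f"Implied share last year (percentage-point reading): {current_share - stated_increase:.1%}")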
According to the 2nd paragraph of the text, after the Covid-19 outbreak, banks initially had to face the following number of challenges:
a) 1
b) 2
c) 3
d) 4
e) 5
www.tecconcursos.com.br/questoes/1757163
Like all businesses, banks have had to act fast to respond to the unprecedented human and economic impact of Covid-19.
First, they needed to keep the lights on and ensure business continuity. Second, they had to meet the changing ways customers wanted to engage. Finally, they sought to
balance their business priorities with a responsibility to support society. Previous crises cast the banks as part of the problem — this time they are part of the solution.
Banks who have embraced modern banking technology have fared better in meeting these challenges. They’ve moved seamlessly to remote working, kept up service for
their customers, coped with huge increases in demand and quickly adapted their products. In contrast, banks using legacy ‘spaghetti’ software have struggled.
Covid-19 has accelerated the need for modern banking technology, but it didn’t create it. Before coronavirus, the 2020s were already being framed as the decade for digital
in the banking industry. Banks’ return on equity was too low and their cost-income ratios were too high. Meanwhile, regulation like open banking was disrupting the
industry and increasing competition from new entrants like the GAAFAs (Google, Amazon, Alibaba, Facebook, Apple).
Providing seamless digital customer experiences was therefore already a ‘must’. Every year, Temenos partners with the Economist Intelligence Unit (EIU) for a global study
on the future of banking. More than 300 banking leaders are interviewed from retail, commercial and private banks. Over half of these are at C-suite level.
In 2020, the study took place amid the Covid-19 crisis. The results give a fascinating insight into banking leaders’ approach during these unprecedented times. But they
also show how they see their industry in the years to come.
And the findings suggest three trends which will shape the future of banking:
1. New technologies will be the key driver of banking transformation over the next 5 years. 77% of respondents strongly believed that Artificial Intelligence (AI) will
be the most game-changing of these technologies. They see a diverse range of uses for AI — from personalised customer experience to fraud detection.
2. Banks will overhaul their business models to create digital ecosystems. 80% of respondents believe that banking will become part of a platform of services. 45%
are committed to transforming their business models into digital ecosystems.
3. The sun will set on branch banking. World Bank data shows that visits to branches have been steadily declining globally over the last decade. As a result of
coronavirus, customers are now more concerned about visiting their branch, and so even more people are willing to try digital applications. This combination of
pandemic and increasingly transformative advanced technology has led a majority of respondents (59%) to our survey with the EIU to state that traditional branch-
based banking model will be dead in just five years. That’s a 34% increase from last year.
The current environment is undoubtedly challenging for banks. But they have the capital, customer relationships and customer data. They are regulated. And most
importantly: they still enjoy their customers’ trust.
In short, banks are best-placed to succeed if they commit to end-to-end digital transformation. That means a fully digital front office which creates hyper-personalized
experiences and ecosystems. And a back office driving efficient operations and rapid innovation. By embracing modern banking technology, banks can support their
customers today, create new value for the future and drive new levels of future growth.
From the sentence of the last paragraph, “By embracing modern banking technology, banks can support their customers today, create new value for the future and drive
new levels of future growth”, it is inferred that
www.tecconcursos.com.br/questoes/1757448
WASHINGTON — American intelligence officials have found no evidence that aerial phenomena observed by Navy pilots in recent years are alien spacecraft, but they still
cannot explain the unusual movements that have mystified scientists and the military.
The report determines that a vast majority of more than 120 incidents over the past two decades did not originate from any American military or other advanced US
government technology, the officials said. That determination would appear to eliminate the possibility that Navy pilots who reported seeing unexplained aircraft might have
encountered programs the government meant to keep secret.
But that is about the only conclusive finding in the classified intelligence report, the officials said. And while a forthcoming unclassified version, expected to be released to
Congress by June 25, will present few other firm conclusions, senior officials briefed on the intelligence conceded that the very ambiguity of the findings meant the
government could not definitively rule out theories that the phenomena observed by military pilots might be alien spacecraft.
Americans’ long-running fascination with UFOs has intensified in recent weeks in anticipation of the release of the government report. Former President Barack Obama
encouraged the interest when he gave an interview last month about the incidents on “The Late Late Show with James Corden” on CBS.
“What is true, and I’m really being serious here,” Mr. Obama said, “is that there is film and records of objects in the skies that we don’t know exactly what they are.’’
The report concedes that much about the observed phenomena remains difficult to explain, including their acceleration, as well as ability to change direction and
submerge. One possible explanation — that the phenomena could be weather balloons or other research balloons — does not hold up in all cases, the officials said,
because of changes in wind speed at the times of some of the interactions.
Many of the more than 120 incidents examined in the report are from Navy personnel, officials said. The report also examined incidents involving foreign militaries over the
last two decades. Intelligence officials believe that at least some of the aerial phenomena could have been experimental technology from a rival power, most likely Russia
or China.
One senior official said without hesitation that U.S. officials knew it was not American technology. He said there was worry among intelligence and military officials that
China or Russia could be experimenting with hypersonic technology.
He and other officials spoke about the classified findings in the report on the condition of anonymity.
One of the purposes of the text is to confirm that the report determines the
www.tecconcursos.com.br/questoes/1757455
WASHINGTON — American intelligence officials have found no evidence that aerial phenomena observed by Navy pilots in recent years are alien spacecraft, but they still
cannot explain the unusual movements that have mystified scientists and the military.
The report determines that a vast majority of more than 120 incidents over the past two decades did not originate from any American military or other advanced US
government technology, the officials said. That determination would appear to eliminate the possibility that Navy pilots who reported seeing unexplained aircraft might have
encountered programs the government meant to keep secret.
But that is about the only conclusive finding in the classified intelligence report, the officials said. And while a forthcoming unclassified version, expected to be released to
Congress by June 25, will present few other firm conclusions, senior officials briefed on the intelligence conceded that the very ambiguity of the findings meant the
government could not definitively rule out theories that the phenomena observed by military pilots might be alien spacecraft.
Americans’ long-running fascination with UFOs has intensified in recent weeks in anticipation of the release of the government report. Former President Barack Obama
encouraged the interest when he gave an interview last month about the incidents on “The Late Late Show with James Corden” on CBS.
“What is true, and I’m really being serious here,” Mr. Obama said, “is that there is film and records of objects in the skies that we don’t know exactly what they are.’’
The report concedes that much about the observed phenomena remains difficult to explain, including their acceleration, as well as ability to change direction and
submerge. One possible explanation — that the phenomena could be weather balloons or other research balloons — does not hold up in all cases, the officials said,
because of changes in wind speed at the times of some of the interactions.
Many of the more than 120 incidents examined in the report are from Navy personnel, officials said. The report also examined incidents involving foreign militaries over the
last two decades. Intelligence officials believe that at least some of the aerial phenomena could have been experimental technology from a rival power, most likely Russia
or China.
One senior official said without hesitation that U.S. officials knew it was not American technology. He said there was worry among intelligence and military officials that
China or Russia could be experimenting with hypersonic technology.
He and other officials spoke about the classified findings in the report on the condition of anonymity.
In the 7th paragraph of the text, in the fragment “Intelligence officials believe that at least some of the aerial phenomena could have been experimental technology from a
rival power, most likely Russia or China”, the report’s authors express
a) strong desire
b) irrefutable fact
c) equivocal probability
d) reasonable possibility
e) unrealistic hypothesis
www.tecconcursos.com.br/questoes/1757457
a) kept secrets.
b) hid their names.
c) invented stories.
d) omitted the truth.
e) said who they were.
www.tecconcursos.com.br/questoes/1757692
As we practice social distancing and businesses struggle to adapt, it’s no secret the unique challenges of Covid-19 are profoundly shaping our economic climate. U.S. Bank
financial industry and regulatory affairs expert Robert Schell explains what you need to know in this uncertain time.
Keep your financial habits as normal as possible during this time. Make online purchases, order takeout, pay bills and buy groceries. These everyday purchases put money
back into the economy and prevent it from dipping further into a recession.
www.tecconcursos.com.br/questoes/1757694
In the 1st paragraph, in the fragment “it’s no secret the unique challenges of Covid-19 are profoundly shaping our economic climate”, the expression it’s no secret (that)
means
www.tecconcursos.com.br/questoes/1757698
In the 4th paragraph, in the fragment “In March, the Federal Reserve cut rates drastically to boost economic activity”, the verb cut indicates a
a) habitual action repeatedly carried out by the Federal Reserve to address certain economic situations.
b) future action to be carried out by the Federal Reserve to address possible problems.
c) promised action to be carried out by the Federal Reserve to address the present economic challenges.
d) one-time action carried out by the Federal Reserve to address the present situation.
e) current action carried out by the Federal Reserve to address a permanent situation.
www.tecconcursos.com.br/questoes/613542
www.tecconcursos.com.br/questoes/613543
The shipping industry needs to design ships differently and be more technologically innovative to reach world climate goals and counter cybersecurity risks, it was agreed
at the annual Tripartite Shipbuilding Forum.
At the meeting in Nantong, China, held on November 1-3, the forum reached several general conclusions on ship design and technology.
This year’s themes were decarbonization of ships, safe design and digitalization. These issues are interlinked as they are all relevant to the creation of a more efficient
seaborne transport system.
At the end of two days of debate, it was concluded that the industry urgently needs new ship designs, equipment, propulsion systems and alternative fuels to achieve the
CO2 reduction goals established by the Paris Agreement on climate change, and the specific objectives to be established for international shipping by the IMO (International
Maritime Organization), a specialized agency of the United Nations, as part of its GHG (greenhouse gas) reduction strategy.
It was agreed that the industry needs to use all available technology to a much greater extent, and increase technological innovation to reduce CO2 emissions to the
ambitious degree required by the international community.
The Tripartite forum has therefore established inter-industry working groups with the aim of developing a better understanding of current R&D (research and development)
efforts for the new technologies needed by the shipping sector to realize its vision for zero CO2 emissions this century.
The participants hope that the general understandings reached at the meeting will send an important signal to all industry stakeholders about the vital role that everyone
must play to deliver the continuous improvement of shipping’s environmental performance now demanded by global society. The critical importance of the safety of
seafarers and the ships they operate was also part of the meeting’s agenda. As explained, there are increasing concerns that new regulations governing ship designs
aimed at further reducing CO2 emissions could potentially have adverse effects on the safe operation of ships.
One example would be any legal requirements that led to a further reduction of engine power. The concern is that ships could get into problems during bad weather if the
engine is insufficiently powered, putting both the crew and the environment at serious risk.
Moreover, recent cyber attacks have increased awareness of potential threats facing the industry. When it comes to ship design and construction, it was generally agreed
that the industry needs to adopt new methods and standards to create more resilient digital systems on board. A more layered approach to a ship’s digital system and
greater segregation can increase safety, so that a single attack cannot readily spread to IT (information technology) and other systems both on board the ship and ashore.
The Tripartite forum agreed that in advance of its next meeting in Korea in 2018, the industry partners represented at Tripartite will work together to develop new design
standards, which will help raise the resilience of ships’ digital systems and make them more resistant to possible cyber-attacks.
Available at: <http://worldmaritimenews.com/archives/236231/ forum-industry-needs-to-design-ships-differently/>. Retrieved on: Dec. 2, 2017. Adapted.
e) excluded from the debates during the annual shipbuilding industry forum.
www.tecconcursos.com.br/questoes/613544
According to the fragment “the industry needs to use all available technology to a much greater extent”, the use of technology in modern ship design is
a) optional
b) irrelevant
c) controversial
e) highly recommended
www.tecconcursos.com.br/questoes/613545
www.tecconcursos.com.br/questoes/613547
Based on the text, “the continuous improvement of shipping’s environmental performance” is a request from the
a) seafarers
b) global society
c) industry stakeholders
www.tecconcursos.com.br/questoes/613564
www.tecconcursos.com.br/questoes/613569
According to the last paragraph, the ship industry partners represented at the Tripartite forum will cooperate to
www.tecconcursos.com.br/questoes/615413
On container ships, cargo is carried in standardized containers, which are placed one over the other and secured using lashing.
While at sea, the ship is subjected to heavy rolling and pitching, which can not only disturb the cargo but also upset the stability of the ship. Parametric rolling – a unique
phenomenon on container ships – must be carefully dealt with in order to ensure the safety of cargo containers at sea.
Keeping a watch on the loaded cargo containers when the container ship is sailing is as important as preparing a container ship for loading cargo. Also, officers
must know all the important equipment and tools which are used to handle cargo on container ships.
The following important points must be considered for taking care of cargo containers while at sea:
Check lashing
Proper container lashing is one of the most important aspects of securing cargo safely on the ships. Every officer in charge of cargo loading and unloading must know
and understand the important points for safe container lashing.
Moreover, when the ship is sailing, lashing must be checked at least once a day and tightened whenever necessary.
If the ship is about to enter rough sea or in case of heavy weather, lashing should be frequently checked and additional lashing must be provided wherever required.
e) reveal all the risks to cargo caused by leakages from water and oil systems.
www.tecconcursos.com.br/questoes/615423
According to the highlighted section, paying attention to the loaded cargo containers while at sea and preparing a container ship for loading cargo are
b) of equal relevance
www.tecconcursos.com.br/questoes/615446
www.tecconcursos.com.br/questoes/615451
e) malfunctioning containers
www.tecconcursos.com.br/questoes/615458
According to the text, all the following points must be considered for keeping a watch on cargo containers while at sea, EXCEPT
www.tecconcursos.com.br/questoes/618967
The resurgence in oil and gas production from the United States, deep declines in the cost of renewables and growing electrification are changing the face of the global
energy system and upending traditional ways of meeting energy demand, according to the World Energy Outlook 2017. A cleaner and more diversified energy mix in China
is another major driver of this transformation.
Over the next 25 years, the world’s growing energy needs are met first by renewables and natural gas, as fast-declining costs turn solar power into the cheapest source
of new electricity generation. Global energy demand is 30% higher by 2040 — but still half as much as it would have been without efficiency improvements. The boom
years for coal are over — in the absence of large-scale carbon capture, utilization and storage (CCUS) — and rising oil demand slows down but is not reversed before
2040 even as electric-car sales rise steeply.
WEO-2017, the International Energy Agency (IEA)’s flagship publication, finds that over the next two decades the global energy system is being reshaped by four major
forces: the United States is set to become the undisputed global oil and gas leader; renewables are being deployed rapidly thanks to falling costs; the share of electricity in
the energy mix is growing; and China’s new economic strategy takes it on a cleaner growth mode, with implications for global energy markets.
Solar PV is set to lead capacity additions, pushed by deployment in China and India; meanwhile in the European Union, wind becomes the leading source of electricity
soon after 2030.
“Solar is forging ahead in global power markets as it becomes the cheapest source of electricity generation in many places, including China and India,” said Dr Fatih
Birol, the IEA’s executive director. “Electric vehicles (EVs) are in the fast lane as a result of government support and declining battery costs but it is far too early to write the
obituary of oil, as growth for trucks, petrochemicals, shipping and aviation keep pushing demand higher. The US becomes the undisputed leader for oil and gas production
for decades, which represents a major upheaval for international market dynamics.”
These themes — as well as the future role of oil and gas in the energy mix, how clean-energy technologies are deploying, and the need for more investment in CCUS —
were among the key topics discussed by the world’s energy leaders at the IEA’s 2017 Ministerial Meeting in Paris last week.
This year, WEO-2017 includes a special focus on China, where economic and energy policy changes underway will have a profound impact on the country’s energy mix,
and continue to shape global trends. A new phase in the country’s development results in an economy that is less reliant on heavy industry and coal.
At the same time, a strong emphasis on cleaner energy technologies, in large part to address poor air quality, is catapulting China to a position as a world leader in
wind, solar, nuclear and electric vehicles and the source of more than a quarter of projected growth in natural gas consumption. As demand growth in China slows, other
countries continue to push overall global demand higher – with India accounting for almost one-third of global growth to 2040.
The shale oil and gas revolution in the United States continues thanks to the remarkable ability of producers to unlock new resources in a cost-effective way. By the
mid-2020s, the United States is projected to become the world’s largest LNG exporter and a net oil exporter by the end of that decade.
This is having a major impact on oil and gas markets, challenging incumbent suppliers and provoking a major reorientation of global trade flows, with consumers in Asia
accounting for more than 70% of global oil and gas imports by 2040. LNG from the United States is also accelerating a major structural shift towards a more flexible and
globalized gas market.
WEO-2017 finds it is too early to write the obituary of oil. Global oil demand continues to grow to 2040, although at a steadily decreasing pace – while fuel efficiency and
rising electrification bring a peak in oil used for passenger cars, even with a doubling of the car fleet to two billion. But other sectors – namely petrochemicals, trucks,
aviation, and shipping – drive up oil demand to 105 million barrels a day by 2040. While carbon emissions have flattened in recent years, the report finds that global
energy-related CO2 emissions increase slightly by 2040, but at a slower pace than in last year’s projections. Still, this is far from enough to avoid severe impacts of climate
change.
Available at: <https://www.iea.org/newsroom/news/2017/ november/a-world-in-transformation-world-energyoutlook-2017.html>. Retrieved on: 14 Nov. 2017. Adapted.
a) predict the imminent decrease of global oil demands in the near future.
c) report on the increasing role of renewable energy sources and natural gas.
d) discuss how China’s economic and energy policy changes may shape global trends.
e) anticipate how the US, China and India will transform the global energy system in the next decade.
www.tecconcursos.com.br/questoes/618969
According to the text, one of the themes discussed at the IEA’s 2017 Ministerial Meeting in Paris was the
a) insufficient investment in clean-energy technologies.
e) limited use of EVs due to battery prices and lack of financial help from the government.
www.tecconcursos.com.br/questoes/618971
According to the text, WEO-2017 includes a special focus on China because this country has been
b) blamed for substituting heavy industry and coal for cleaner energy.
d) an undeniable world leader in the areas of wind, solar and nuclear energy.
e) facing changes in the economic and energy policy that will deeply influence its energy mix.
www.tecconcursos.com.br/questoes/619271
a) “The resurgence in oil and gas production from the United States, deep declines in the cost of renewables and growing electrification are changing the face of
the global energy system”
b) “the world’s growing energy needs are met first by renewables and natural gas as fast-declining costs turn solar power into the cheapest source of new
electricity generation”
c) “WEO-2017 (…) finds that over the next two decades the global energy system is being reshaped by four major forces”
d) “meanwhile in the European Union, wind becomes the leading source of electricity soon after 2030”
e) “the United States is projected to become the world’s largest LNG exporter and a net oil exporter by the end of that decade.”
www.tecconcursos.com.br/questoes/619301
BRASILIA – The International Energy Agency and Brazil jointly announced today that the country joined the IEA as an Association country, opening new avenues for
cooperation towards a more secure and sustainable energy future with Latin America’s largest country.
“With today’s announcement of IEA Association, we are taking another important step to place Brazil at the centre of global debate on key energy policy issues including
renewable energy, energy efficiency, rational use of fossil fuels, energy security and sustainable development,” said Fernando Coelho Filho, Minister of Mines and Energy.
Brazil’s leading expertise in bioenergy, hydro and other forms of clean and conventional energy is recognized around the world, and provides an excellent basis to
develop solutions for global energy challenges. The country’s experience in managing renewable resources in its energy mix can contribute greatly to IEA discussions on a
broadened concept of energy security. Brazil has also pioneered the use of auctions for long-term contracts for renewable energy, a model that is now successfully applied
as best-practice world-wide.
Brazil and the IEA plan to work jointly across a wide range of energy-related activities. These include implementation of The Biofuture Platform, which aims to promote
international coordination on advanced low carbon fuels. The IEA will also support the development of Brazil’s ten-year energy efficiency plan and co-host an energy
efficiency training event in Brazil to share regional and global experiences.
“Brazil’s experience shows that policies do matter,” said Dr Fatih Birol, the IEA’s Executive Director. “Its determined and ambitious long-term energy policies, developing
deep-water oil resources and expanding biofuels output, set an example to countries around the world. As a result, our latest data shows that Brazil will become a net oil
exporter this year, the first major consumer in recent history to ever achieve such a turnaround.”
Dr Birol also congratulated Brazil for its recent successful deepwater bid round. The IEA now finds that Brazil, after depending on oil imports since IEA records began in the
1970s, will become a net exporter this year and will export nearly one million barrels of oil per day to world markets by 2022. This is the result of a 50% increase in oil
production in the past decade thanks to a successful push into deep-water production, and a biofuels programme that has helped keep domestic oil-demand growth under
control.
With Brazil, the IEA family now accounts for over 70% of the world’s total energy consumption, compared with less than 40% just two years ago. The seven IEA
Association countries are Brazil, China, India, Indonesia, Morocco, Singapore and Thailand.
The agreement will allow the IEA to benefit from Brazil’s unique experience, which has enabled it to develop one of the cleanest energy mixes in the world. Thanks to its
expertise in global energy market and policy analysis, the IEA can support Brazil’s efforts and collaborate in its energy transition.
Available at: <https://www.iea.org/newsroom/news/2017/ october/brazil-joins-iea-as-an-association-country-reshaping-international-energy-govern.html>. Retrieved on: 31 Oct. 2017. Adapted.
The main intention of the text is to discuss the Brazilian
a) joint effort with the IEA in order to implement The Biofuture Platform in the near future.
c) association with the IEA to replicate the use of auctions for renewable energies worldwide.
d) strategic partnership with the IEA in the field of energy aiming at a safer and sustainable future.
e) ten-year energy efficiency plan and the sharing of its regional and global experiences with Latin American countries.
www.tecconcursos.com.br/questoes/619304
Dr. Fatih Birol affirms that “Brazil’s experience shows that policies do matter” because, due to its long-term energy policies, the country
a) was about to change its position from a major oil consumer into that of a net oil exporter.
b) could dramatically increase oil exports to nearly one million barrels per day to world markets.
c) was able to expand its deep-water oil resources and restrict biofuels output in the recent years.
d) implemented a rewarding biofuels programme that helped reduce national oil-demand growth.
e) succeeded in doubling its oil production in the last few years as the result of an outstanding increase in deep-water production.
www.tecconcursos.com.br/questoes/619315
The resurgence in oil and gas production from the United States, deep declines in the cost of renewables and growing electrification are changing the face of the global
energy system and upending traditional ways of meeting energy demand, according to the World Energy Outlook 2017. A cleaner and more diversified energy mix in China
is another major driver of this transformation.
Over the next 25 years, the world’s growing energy needs are met first by renewables and natural gas, as fast-declining costs turn solar power into the cheapest source
of new electricity generation. Global energy demand is 30% higher by 2040 — but still half as much as it would have been without efficiency improvements. The boom
years for coal are over — in the absence of large-scale carbon capture, utilization and storage (CCUS) — and rising oil demand slows down but is not reversed before
2040 even as electric-car sales rise steeply.
WEO-2017, the International Energy Agency (IEA)’s flagship publication, finds that over the next two decades the global energy system is being reshaped by four major
forces: the United States is set to become the undisputed global oil and gas leader; renewables are being deployed rapidly thanks to falling costs; the share of electricity in
the energy mix is growing; and China’s new economic strategy takes it on a cleaner growth mode, with implications for global energy markets.
Solar PV is set to lead capacity additions, pushed by deployment in China and India; meanwhile, in the European Union, wind becomes the leading source of electricity
soon after 2030.
“Solar is forging ahead in global power markets as it becomes the cheapest source of electricity generation in many places, including China and India,” said Dr Fatih
Birol, the IEA’s executive director. “Electric vehicles (EVs) are in the fast lane as a result of government support and declining battery costs but it is far too early to write the
obituary of oil, as growth for trucks, petrochemicals, shipping and aviation keep pushing demand higher. The US becomes the undisputed leader for oil and gas production
for decades, which represents a major upheaval for international market dynamics.”
These themes — as well as the future role of oil and gas in the energy mix, how clean-energy technologies are deploying, and the need for more investment in CCUS —
were among the key topics discussed by the world’s energy leaders at the IEA’s 2017 Ministerial Meeting in Paris last week.
This year, WEO-2017 includes a special focus on China, where economic and energy policy changes underway will have a profound impact on the country’s energy mix,
and continue to shape global trends. A new phase in the country’s development results in an economy that is less reliant on heavy industry and coal.
At the same time, a strong emphasis on cleaner energy technologies, in large part to address poor air quality, is catapulting China to a position as a world leader in
wind, solar, nuclear and electric vehicles and the source of more than a quarter of projected growth in natural gas consumption. As demand growth in China slows, other
countries continue to push overall global demand higher – with India accounting for almost one-third of global growth to 2040.
The shale oil and gas revolution in the United States continues thanks to the remarkable ability of producers to unlock new resources in a cost-effective way. By the mid-
2020s, the United States is projected to become the world’s largest LNG exporter and a net oil exporter by the end of that decade.
This is having a major impact on oil and gas markets, challenging incumbent suppliers and provoking a major reorientation of global trade flows, with consumers in Asia
accounting for more than 70% of global oil and gas imports by 2040. LNG from the United States is also accelerating a major structural shift towards a more flexible and
globalized gas market.
WEO-2017 finds it is too early to write the obituary of oil. Global oil demand continues to grow to 2040, although at a steadily decreasing pace – while fuel efficiency and
rising electrification bring a peak in oil used for passenger cars, even with a doubling of the car fleet to two billion. But other sectors – namely petrochemicals, trucks,
aviation, and shipping – drive up oil demand to 105 million barrels a day by 2040. While carbon emissions have flattened in recent years, the report finds that global
energy-related CO2 emissions increase slightly by 2040, but at a slower pace than in last year’s projections. Still, this is far from enough to avoid severe impacts of climate
change.
Available at: <https://www.iea.org/newsroom/news/2017/november/a-world-in-transformation-world-energyoutlook-2017.html>. Retrieved on: 14 Nov. 2017. Adapted.
Text II
BRASILIA – The International Energy Agency and Brazil jointly announced today that the country joined the IEA as an Association country, opening new avenues for
cooperation towards a more secure and sustainable energy future with Latin America’s largest country.
a) only Text I mentions a country that is well-known for its clean energy mix.
b) only Text II discusses what the global energy system will look like in the near future.
c) neither Text I nor Text II expresses concern with the future of oil production and demand in the next decades.
d) both Text I and Text II list all the IEA association countries and discuss how they can benefit from this cooperation.
e) both Text I and Text II mention the importance of renewable resources and clean energy technologies as a means of meeting energy demand.
www.tecconcursos.com.br/questoes/637639
What are the issues that will shape the global energy market in 2018? What will be the energy mix, trade patterns and price trends? Every country is different and local
factors, including politics, are important. But at the global level there are four key questions, and the answer to each of them is highly uncertain.
The first question is whether Saudi Arabia is stable. The kingdom’s oil exports now mostly go to Asia but the volumes involved mean that any volatility will destabilise a
market where speculation is rife.
The risk is that an open conflict, which Iran and Saudi have traditionally avoided despite all their differences, would spread and hit oil production and trade. It is worth
remembering that the Gulf states account for a quarter of global production and over 40 per cent of all the oil traded globally. The threat to stability is all the greater given
that Iran is likely to win any such clash and to treat the result as a licence to reassert its influence in the region.
The second question is how rapidly production of oil from shale rock will grow in the US — 2017 has seen an increase of 600,000 barrels a day to over 6m. The increase
in global prices over the past six months has made output from almost all America’s producing areas commercially viable and drilling activity is rising. A comparable
increase in 2018 would offset most of the current OPEC production cuts and either force another quota reduction or push prices down.
The third question concerns China. For the last three years the country has managed to deliver economic growth with only minimal increases in energy consumption.
Growth was probably lower than the claimed numbers — the Chinese do not like to admit that they, too, are subject to economic cycles and recessions — but even so the
achievement is considerable. The question is whether the trend can be continued. If it can, the result will limit global demand growth for oil, gas and coal.
China, which accounts for a quarter of the world’s daily energy use, is the swing consumer. If energy efficiency gains continue, CO2 emissions will remain flat or even fall.
The country’s economy is changing and moving away from heavy industry fuelled largely by coal to a more service-based one, with a more varied fuel mix. But the pace of
that shift is uncertain and some recent data suggests that as economic growth has picked up, so has consumption of oil and coal. Beijing has high ambitions for a much
cleaner energy economy, driven not least by the levels of air pollution in many of the major cities; 2018 will show how much progress they are making.
The fourth question is, if anything, the most important. How fast can renewables grow? The last few years have seen dramatic reductions in costs and strong increase in
supply. The industry has had a great year, with bids from offshore wind for capacity auctions in the UK and elsewhere at record low levels. Wind is approaching grid parity
— the moment when it can compete without subsidies. Solar is also thriving: according to the International Energy Agency, costs have fallen by 70 per cent since 2010 not
least because of advances in China, which now accounts for 60 per cent of total solar cell manufacturing capacity. The question is how rapidly all those gains can be
translated into electric supply.
Renewables, including hydro, accounted for just 5 per cent of global daily energy supply according to the IEA’s latest data. That is increasing — solar photovoltaic capacity
grew by 50 per cent in 2016 — but to make a real difference the industry needs a period of expansion comparable in scale to the growth of personal computing and mobile
phones in the 1990s and 2000s.
The problem is that the industry remains fragmented. Most renewable companies are small and local, and in many cases undercapitalised; some are built to collect
subsidies. A radical change will be necessary to make the industry global and capable of competing on the scale necessary to displace coal and natural gas. The coming
year will show us whether it is ready for that challenge.
In many ways, the energy business is at a moment of change and transition. Every reader will have their own view on each of the four questions. To me, the prospect is of
supply continuing to outpace demand. If that is right, the surge in oil prices over the past two months is a temporary and unsustainable phenomenon. It would take another
Middle East war to change the equation. Unfortunately, that is all too possible.
Available at: <https://www.ft.com/content/c9bdc750-ec85-11e7-8713-513b1d7ca85a>. Retrieved on: Feb 18, 2018. Adapted.
www.tecconcursos.com.br/questoes/637640
What are the issues that will shape the global energy market in 2018? What will be the energy mix, trade patterns and price trends? Every country is different and local
factors, including politics, are important. But at the global level there are four key questions, and the answer to each of them is highly uncertain.
The first question is whether Saudi Arabia is stable. The kingdom’s oil exports now mostly go to Asia but the volumes involved mean that any volatility will destabilise a
market where speculation is rife.
The risk is that an open conflict, which Iran and Saudi have traditionally avoided despite all their differences, would spread and hit oil production and trade. It is worth
remembering that the Gulf states account for a quarter of global production and over 40 per cent of all the oil traded globally. The threat to stability is all the greater given
that Iran is likely to win any such clash and to treat the result as a licence to reassert its influence in the region.
The second question is how rapidly production of oil from shale rock will grow in the US — 2017 has seen an increase of 600,000 barrels a day to over 6m. The increase
in global prices over the past six months has made output from almost all America’s producing areas commercially viable and drilling activity is rising. A comparable
increase in 2018 would offset most of the current OPEC production cuts and either force another quota reduction or push prices down.
The third question concerns China. For the last three years the country has managed to deliver economic growth with only minimal increases in energy consumption.
Growth was probably lower than the claimed numbers — the Chinese do not like to admit that they, too, are subject to economic cycles and recessions — but even so the
achievement is considerable. The question is whether the trend can be continued. If it can, the result will limit global demand growth for oil, gas and coal.
China, which accounts for a quarter of the world’s daily energy use, is the swing consumer. If energy efficiency gains continue, CO2 emissions will remain flat or even fall.
The country’s economy is changing and moving away from heavy industry fuelled largely by coal to a more service-based one, with a more varied fuel mix. But the pace of
that shift is uncertain and some recent data suggests that as economic growth has picked up, so has consumption of oil and coal. Beijing has high ambitions for a much
cleaner energy economy, driven not least by the levels of air pollution in many of the major cities; 2018 will show how much progress they are making.
The fourth question is, if anything, the most important. How fast can renewables grow? The last few years have seen dramatic reductions in costs and strong increase in
supply. The industry has had a great year, with bids from offshore wind for capacity auctions in the UK and elsewhere at record low levels. Wind is approaching grid parity
— the moment when it can compete without subsidies. Solar is also thriving: according to the International Energy Agency, costs have fallen by 70 per cent since 2010 not
least because of advances in China, which now accounts for 60 per cent of total solar cell manufacturing capacity. The question is how rapidly all those gains can be
translated into electric supply.
Renewables, including hydro, accounted for just 5 per cent of global daily energy supply according to the IEA’s latest data. That is increasing — solar photovoltaic capacity
grew by 50 per cent in 2016 — but to make a real difference the industry needs a period of expansion comparable in scale to the growth of personal computing and mobile
phones in the 1990s and 2000s.
The problem is that the industry remains fragmented. Most renewable companies are small and local, and in many cases undercapitalised; some are built to collect
subsidies. A radical change will be necessary to make the industry global and capable of competing on the scale necessary to displace coal and natural gas. The coming
year will show us whether it is ready for that challenge.
In many ways, the energy business is at a moment of change and transition. Every reader will have their own view on each of the four questions. To me, the prospect is of
supply continuing to outpace demand. If that is right, the surge in oil prices over the past two months is a temporary and unsustainable phenomenon. It would take another
Middle East war to change the equation. Unfortunately, that is all too possible.
Available at: <https://www.ft.com/content/c9bdc750-ec85-11e7-8713-513b1d7ca85a>. Retrieved on: Feb 18, 2018. Adapted.
Saudi Arabia and Iran are mentioned in paragraphs 2 and 3 (lines 8-20) because they
a) are latent enemies about to engage in violent strife.
b) produce more than 40 per cent of the world’s crude oil.
c) should spread their influence over the other Gulf States.
d) can be considered the most stable countries in the Middle East.
e) might affect oil production and trade if they engage in an open conflict.
www.tecconcursos.com.br/questoes/637645
What are the issues that will shape the global energy market in 2018? What will be the energy mix, trade patterns and price trends? Every country is different and local
factors, including politics, are important. But at the global level there are four key questions, and the answer to each of them is highly uncertain.
The first question is whether Saudi Arabia is stable. The kingdom’s oil exports now mostly go to Asia but the volumes involved mean that any volatility will destabilise a
market where speculation is rife.
The risk is that an open conflict, which Iran and Saudi have traditionally avoided despite all their differences, would spread and hit oil production and trade. It is worth
remembering that the Gulf states account for a quarter of global production and over 40 per cent of all the oil traded globally. The threat to stability is all the greater given
that Iran is likely to win any such clash and to treat the result as a licence to reassert its influence in the region.
The second question is how rapidly production of oil from shale rock will grow in the US — 2017 has seen an increase of 600,000 barrels a day to over 6m. The increase
in global prices over the past six months has made output from almost all America’s producing areas commercially viable and drilling activity is rising. A comparable
increase in 2018 would offset most of the current OPEC production cuts and either force another quota reduction or push prices down.
The third question concerns China. For the last three years the country has managed to deliver economic growth with only minimal increases in energy consumption.
Growth was probably lower than the claimed numbers — the Chinese do not like to admit that they, too, are subject to economic cycles and recessions — but even so the
achievement is considerable. The question is whether the trend can be continued. If it can, the result will limit global demand growth for oil, gas and coal.
China, which accounts for a quarter of the world’s daily energy use, is the swing consumer. If energy efficiency gains continue, CO2 emissions will remain flat or even fall.
The country’s economy is changing and moving away from heavy industry fuelled largely by coal to a more service-based one, with a more varied fuel mix. But the pace of
that shift is uncertain and some recent data suggests that as economic growth has picked up, so has consumption of oil and coal. Beijing has high ambitions for a much
cleaner energy economy, driven not least by the levels of air pollution in many of the major cities; 2018 will show how much progress they are making.
The fourth question is, if anything, the most important. How fast can renewables grow? The last few years have seen dramatic reductions in costs and strong increase in
supply. The industry has had a great year, with bids from offshore wind for capacity auctions in the UK and elsewhere at record low levels. Wind is approaching grid parity
— the moment when it can compete without subsidies. Solar is also thriving: according to the International Energy Agency, costs have fallen by 70 per cent since 2010 not
least because of advances in China, which now accounts for 60 per cent of total solar cell manufacturing capacity. The question is how rapidly all those gains can be
translated into electric supply.
Renewables, including hydro, accounted for just 5 per cent of global daily energy supply according to the IEA’s latest data. That is increasing — solar photovoltaic capacity
grew by 50 per cent in 2016 — but to make a real difference the industry needs a period of expansion comparable in scale to the growth of personal computing and mobile
phones in the 1990s and 2000s.
The problem is that the industry remains fragmented. Most renewable companies are small and local, and in many cases undercapitalised; some are built to collect
subsidies. A radical change will be necessary to make the industry global and capable of competing on the scale necessary to displace coal and natural gas. The coming
year will show us whether it is ready for that challenge.
In many ways, the energy business is at a moment of change and transition. Every reader will have their own view on each of the four questions. To me, the prospect is of
supply continuing to outpace demand. If that is right, the surge in oil prices over the past two months is a temporary and unsustainable phenomenon. It would take another
Middle East war to change the equation. Unfortunately, that is all too possible.
Available at: <https://www.ft.com/content/c9bdc750-ec85-11e7-8713-513b1d7ca85a>. Retrieved on: Feb 18, 2018. Adapted.
The production of oil from shale rock in the US is mentioned in paragraph 4 (lines 21-29) because in 2018 it
a) can rapidly achieve the record level of 6 million barrels a day.
b) will certainly reach higher levels than those announced in 2017.
c) will make output from America’s producing areas commercially viable in 2018.
d) might compensate for present OPEC production cuts and cause a decrease in oil prices.
e) is going to have devastating effects on the drilling activity in the country in the near future.
www.tecconcursos.com.br/questoes/637651
What are the issues that will shape the global energy market in 2018? What will be the energy mix, trade patterns and price trends? Every country is different and local
factors, including politics, are important. But at the global level there are four key questions, and the answer to each of them is highly uncertain.
The first question is whether Saudi Arabia is stable. The kingdom’s oil exports now mostly go to Asia but the volumes involved mean that any volatility will destabilise a
market where speculation is rife.
The risk is that an open conflict, which Iran and Saudi have traditionally avoided despite all their differences, would spread and hit oil production and trade. It is worth
remembering that the Gulf states account for a quarter of global production and over 40 per cent of all the oil traded globally. The threat to stability is all the greater given
that Iran is likely to win any such clash and to treat the result as a licence to reassert its influence in the region.
The second question is how rapidly production of oil from shale rock will grow in the US — 2017 has seen an increase of 600,000 barrels a day to over 6m. The increase
in global prices over the past six months has made output from almost all America’s producing areas commercially viable and drilling activity is rising. A comparable
increase in 2018 would offset most of the current OPEC production cuts and either force another quota reduction or push prices down.
The third question concerns China. For the last three years the country has managed to deliver economic growth with only minimal increases in energy consumption.
Growth was probably lower than the claimed numbers — the Chinese do not like to admit that they, too, are subject to economic cycles and recessions — but even so the
achievement is considerable. The question is whether the trend can be continued. If it can, the result will limit global demand growth for oil, gas and coal.
China, which accounts for a quarter of the world’s daily energy use, is the swing consumer. If energy efficiency gains continue, CO2 emissions will remain flat or even fall.
The country’s economy is changing and moving away from heavy industry fuelled largely by coal to a more service-based one, with a more varied fuel mix. But the pace of
that shift is uncertain and some recent data suggests that as economic growth has picked up, so has consumption of oil and coal. Beijing has high ambitions for a much
cleaner energy economy, driven not least by the levels of air pollution in many of the major cities; 2018 will show how much progress they are making.
The fourth question is, if anything, the most important. How fast can renewables grow? The last few years have seen dramatic reductions in costs and strong increase in
supply. The industry has had a great year, with bids from offshore wind for capacity auctions in the UK and elsewhere at record low levels. Wind is approaching grid parity
— the moment when it can compete without subsidies. Solar is also thriving: according to the International Energy Agency, costs have fallen by 70 per cent since 2010 not
least because of advances in China, which now accounts for 60 per cent of total solar cell manufacturing capacity. The question is how rapidly all those gains can be
translated into electric supply.
Renewables, including hydro, accounted for just 5 per cent of global daily energy supply according to the IEA’s latest data. That is increasing — solar photovoltaic capacity
grew by 50 per cent in 2016 — but to make a real difference the industry needs a period of expansion comparable in scale to the growth of personal computing and mobile
phones in the 1990s and 2000s.
The problem is that the industry remains fragmented. Most renewable companies are small and local, and in many cases undercapitalised; some are built to collect
subsidies. A radical change will be necessary to make the industry global and capable of competing on the scale necessary to displace coal and natural gas. The coming
year will show us whether it is ready for that challenge.
In many ways, the energy business is at a moment of change and transition. Every reader will have their own view on each of the four questions. To me, the prospect is of
supply continuing to outpace demand. If that is right, the surge in oil prices over the past two months is a temporary and unsustainable phenomenon. It would take another
Middle East war to change the equation. Unfortunately, that is all too possible.
Available at: <https://www.ft.com/content/c9bdc750-ec85-11e7-8713-513b1d7ca85a>. Retrieved on: Feb 18, 2018. Adapted.
a) “over 40 per cent" refers to the percentage of global oil produced by Iran and Saudi.
b) “70 per cent” refers to the percentage decrease in solar energy costs since 2010.
c) “60 per cent” refers to the total percentage of solar cells commercialized in China.
d) “5 per cent” refers to the percentage of global energy generated by hydroelectric plants.
e) “50 per cent” refers to the percentage decrease in solar photovoltaic capacity in 2016.
www.tecconcursos.com.br/questoes/637658
What are the issues that will shape the global energy market in 2018? What will be the energy mix, trade patterns and price trends? Every country is different and local
factors, including politics, are important. But at the global level there are four key questions, and the answer to each of them is highly uncertain.
The first question is whether Saudi Arabia is stable. The kingdom’s oil exports now mostly go to Asia but the volumes involved mean that any volatility will destabilise a
market where speculation is rife.
The risk is that an open conflict, which Iran and Saudi have traditionally avoided despite all their differences, would spread and hit oil production and trade. It is worth
remembering that the Gulf states account for a quarter of global production and over 40 per cent of all the oil traded globally. The threat to stability is all the greater given
that Iran is likely to win any such clash and to treat the result as a licence to reassert its influence in the region.
The second question is how rapidly production of oil from shale rock will grow in the US — 2017 has seen an increase of 600,000 barrels a day to over 6m. The increase
in global prices over the past six months has made output from almost all America’s producing areas commercially viable and drilling activity is rising. A comparable
increase in 2018 would offset most of the current OPEC production cuts and either force another quota reduction or push prices down.
The third question concerns China. For the last three years the country has managed to deliver economic growth with only minimal increases in energy consumption.
Growth was probably lower than the claimed numbers — the Chinese do not like to admit that they, too, are subject to economic cycles and recessions — but even so the
achievement is considerable. The question is whether the trend can be continued. If it can, the result will limit global demand growth for oil, gas and coal.
China, which accounts for a quarter of the world’s daily energy use, is the swing consumer. If energy efficiency gains continue, CO2 emissions will remain flat or even fall.
The country’s economy is changing and moving away from heavy industry fuelled largely by coal to a more service-based one, with a more varied fuel mix. But the pace of
that shift is uncertain and some recent data suggests that as economic growth has picked up, so has consumption of oil and coal. Beijing has high ambitions for a much
cleaner energy economy, driven not least by the levels of air pollution in many of the major cities; 2018 will show how much progress they are making.
The fourth question is, if anything, the most important. How fast can renewables grow? The last few years have seen dramatic reductions in costs and strong increase in
supply. The industry has had a great year, with bids from offshore wind for capacity auctions in the UK and elsewhere at record low levels. Wind is approaching grid parity
— the moment when it can compete without subsidies. Solar is also thriving: according to the International Energy Agency, costs have fallen by 70 per cent since 2010 not
least because of advances in China, which now accounts for 60 per cent of total solar cell manufacturing capacity. The question is how rapidly all those gains can be
translated into electric supply.
Renewables, including hydro, accounted for just 5 per cent of global daily energy supply according to the IEA’s latest data. That is increasing — solar photovoltaic capacity
grew by 50 per cent in 2016 — but to make a real difference the industry needs a period of expansion comparable in scale to the growth of personal computing and mobile
phones in the 1990s and 2000s.
The problem is that the industry remains fragmented. Most renewable companies are small and local, and in many cases undercapitalised; some are built to collect
subsidies. A radical change will be necessary to make the industry global and capable of competing on the scale necessary to displace coal and natural gas. The coming
year will show us whether it is ready for that challenge.
In many ways, the energy business is at a moment of change and transition. Every reader will have their own view on each of the four questions. To me, the prospect is of
supply continuing to outpace demand. If that is right, the surge in oil prices over the past two months is a temporary and unsustainable phenomenon. It would take another
Middle East war to change the equation. Unfortunately, that is all too possible.
Available at: <https://www.ft.com/content/c9bdc750-ec85-11e7-8713-513b1d7ca85a>. Retrieved on: Feb 18, 2018. Adapted.
www.tecconcursos.com.br/questoes/637660
What are the issues that will shape the global energy market in 2018? What will be the energy mix, trade patterns and price trends? Every country is different and local
factors, including politics, are important. But at the global level there are four key questions, and the answer to each of them is highly uncertain.
The first question is whether Saudi Arabia is stable. The kingdom’s oil exports now mostly go to Asia but the volumes involved mean that any volatility will destabilise a
market where speculation is rife.
The risk is that an open conflict, which Iran and Saudi have traditionally avoided despite all their differences, would spread and hit oil production and trade. It is worth
remembering that the Gulf states account for a quarter of global production and over 40 per cent of all the oil traded globally. The threat to stability is all the greater given
that Iran is likely to win any such clash and to treat the result as a licence to reassert its influence in the region.
The second question is how rapidly production of oil from shale rock will grow in the US — 2017 has seen an increase of 600,000 barrels a day to over 6m. The increase
in global prices over the past six months has made output from almost all America’s producing areas commercially viable and drilling activity is rising. A comparable
increase in 2018 would offset most of the current OPEC production cuts and either force another quota reduction or push prices down.
The third question concerns China. For the last three years the country has managed to deliver economic growth with only minimal increases in energy consumption.
Growth was probably lower than the claimed numbers — the Chinese do not like to admit that they, too, are subject to economic cycles and recessions — but even so the
achievement is considerable. The question is whether the trend can be continued. If it can, the result will limit global demand growth for oil, gas and coal.
China, which accounts for a quarter of the world’s daily energy use, is the swing consumer. If energy efficiency gains continue, CO2 emissions will remain flat or even fall.
The country’s economy is changing and moving away from heavy industry fuelled largely by coal to a more service-based one, with a more varied fuel mix. But the pace of
that shift is uncertain and some recent data suggests that as economic growth has picked up, so has consumption of oil and coal. Beijing has high ambitions for a much
cleaner energy economy, driven not least by the levels of air pollution in many of the major cities; 2018 will show how much progress they are making.
The fourth question is, if anything, the most important. How fast can renewables grow? The last few years have seen dramatic reductions in costs and strong increase in
supply. The industry has had a great year, with bids from offshore wind for capacity auctions in the UK and elsewhere at record low levels. Wind is approaching grid parity
— the moment when it can compete without subsidies. Solar is also thriving: according to the International Energy Agency, costs have fallen by 70 per cent since 2010 not
least because of advances in China, which now accounts for 60 per cent of total solar cell manufacturing capacity. The question is how rapidly all those gains can be
translated into electric supply.
Renewables, including hydro, accounted for just 5 per cent of global daily energy supply according to the IEA’s latest data. That is increasing — solar photovoltaic capacity
grew by 50 per cent in 2016 — but to make a real difference the industry needs a period of expansion comparable in scale to the growth of personal computing and mobile
phones in the 1990s and 2000s.
The problem is that the industry remains fragmented. Most renewable companies are small and local, and in many cases undercapitalised; some are built to collect
subsidies. A radical change will be necessary to make the industry global and capable of competing on the scale necessary to displace coal and natural gas. The coming
year will show us whether it is ready for that challenge.
In many ways, the energy business is at a moment of change and transition. Every reader will have their own view on each of the four questions. To me, the prospect is of
supply continuing to outpace demand. If that is right, the surge in oil prices over the past two months is a temporary and unsustainable phenomenon. It would take another
Middle East war to change the equation. Unfortunately, that is all too possible.
Available at: <https://www.ft.com/content/c9bdc750-ec85-11e7-8713-513b1d7ca85a>. Retrieved on: Feb 18, 2018. Adapted.
www.tecconcursos.com.br/questoes/654435
Banks simplify people’s lives, but the business of banking is anything but simple. Every transaction — from cashing a check to taking out a loan — requires careful record
keeping. Behind the scenes in every bank or savings and loan association there are dozens of bank clerks, each an expert at keeping one area of the bank’s business
running smoothly.
New account clerks open and close accounts and answer questions for customers. Interest clerks record interest due to savings account customers, as well as the interest
owed to the bank on loans and other investments. Exchange clerks, who work on international accounts, translate foreign currency values into dollars and vice versa. Loan
clerks sort and record information about loans. Statement clerks are responsible for preparing the monthly balance sheets of checking account customers. Securities clerks
record, file, and maintain stocks, bonds, and other investment certificates. They also keep track of dividends and interest on these certificates.
Other clerks operate the business machines on which modern banks rely. Proof operators sort checks and record the amount of each check. Bookkeeping clerks keep
records of each customer’s account. In addition to these specialists, banks need general clerical help — data entry keyers, file clerks, mail handlers, and messengers —
just as any other business does.
Bank clerks usually need a high school education with an emphasis on basic skills in typing, bookkeeping, and business math. Knowledge of computers and business
machines is also helpful. Prospective bank workers may be tested on their clerical skills when they are interviewed. Most banks provide new employees with on-the-job
training.
Banks prefer to promote their employees rather than hire new workers for jobs that require experience. Clerks frequently become tellers or supervisors. Many banks
encourage their employees to further their education at night.
According to the U.S. Bureau of Labor Statistics, employment of bank clerks was expected to decline through the year 2014, because many banks are electronically
automating their systems and eliminating paperwork as well as many clerical tasks. Workers with knowledge of data processing and computers will have the best
opportunities. In addition to jobs created through expansion, openings at the clerical level often occur as workers move up to positions of greater responsibility.
Working Conditions
Although banks usually provide a pleasant working atmosphere, clerks often work alone, at times performing repetitive tasks. Bank clerks generally work between thirty-
five and forty hours per week, but they may be expected to take on evening and Saturday shifts depending on bank hours.
The salaries of bank clerks vary widely depending on the size and location of the bank and the clerk’s experience. According to the Bureau of Labor Statistics, median
salaries ranged from $23,317 to $27,310 per year in 2004 depending on experience and title. Generally, loan clerks are on the high end of this range, whereas general
office clerks are on the lower end.
Banks typically offer their employees excellent benefits. Besides paid vacations and more than the usual number of paid holidays, employees may receive health and life
insurance and participate in pension and profit-sharing plans. Some banks provide financial aid so that workers can continue their education.
e) ask for changes in the way bank recruiters select their future employees.
www.tecconcursos.com.br/questoes/654439
Banks simplify people’s lives, but the business of banking is anything but simple. Every transaction — from cashing a check to taking out a loan — requires careful record
keeping. Behind the scenes in every bank or savings and loan association there are dozens of bank clerks, each an expert at keeping one area of the bank’s business
running smoothly.
New account clerks open and close accounts and answer questions for customers. Interest clerks record interest due to savings account customers, as well as the interest
owed to the bank on loans and other investments. Exchange clerks, who work on international accounts, translate foreign currency values into dollars and vice versa. Loan
clerks sort and record information about loans. Statement clerks are responsible for preparing the monthly balance sheets of checking account customers. Securities clerks
record, file, and maintain stocks, bonds, and other investment certificates. They also keep track of dividends and interest on these certificates.
Other clerks operate the business machines on which modern banks rely. Proof operators sort checks and record the amount of each check. Bookkeeping clerks keep
records of each customer’s account. In addition to these specialists, banks need general clerical help — data entry keyers, file clerks, mail handlers, and messengers —
just as any other business does.
Bank clerks usually need a high school education with an emphasis on basic skills in typing, bookkeeping, and business math. Knowledge of computers and business
machines is also helpful. Prospective bank workers may be tested on their clerical skills when they are interviewed. Most banks provide new employees with on-the-job
training.
Sometimes bank recruiters visit high schools to look for future employees. High school placement offices can tell students whether this is the practice at their school. If not,
prospective bank workers can apply directly to local banks through their personnel departments. Bank jobs may be listed with state and private employment
agencies. Candidates can also check Internet job sites and the classified ads in local newspapers as well.
Banks prefer to promote their employees rather than hire new workers for jobs that require experience. Clerks frequently become tellers or supervisors. Many banks
encourage their employees to further their education at night.
Working Conditions
Although banks usually provide a pleasant working atmosphere, clerks often work alone, at times performing repetitive tasks. Bank clerks generally work between thirty-
five and forty hours per week, but they may be expected to take on evening and Saturday shifts depending on bank hours.
The salaries of bank clerks vary widely depending on the size and location of the bank and the clerk’s experience. According to the Bureau of Labor Statistics, median
salaries ranged from $23,317 to $27,310 per year in 2004 depending on experience and title. Generally, loan clerks are on the high end of this range, whereas general
office clerks are on the lower end.
Banks typically offer their employees excellent benefits. Besides paid vacations and more than the usual number of paid holidays, employees may receive health and life
insurance and participate in pension and profit-sharing plans. Some banks provide financial aid so that workers can continue their education.
The fragment “Banks simplify people’s lives, but the business of banking is anything but simple” means that banking is a(n)
a) ordinary occupation
b) elementary job
c) complex activity
d) trivial profession
e) easy business
www.tecconcursos.com.br/questoes/694022
Governments need to give technical experts more autonomy and hold their nerve to provide more long-term stability when investing in clean energy, argue researchers in
climate change and innovation policy in a new paper published today.
Writing in the journal Nature, the authors from UK and US institutions have set out guidelines for investment based on an analysis of the last twenty years of “what works”
in clean energy research and innovation programs.
Their six simple “guiding principles” also include the need to channel innovation into the private sector through formal tech transfer programs, and to think in terms of
lasting knowledge creation rather than ‘quick win’ potential when funding new projects.
The authors offer a stark warning to governments and policymakers: learn from and build on experience before time runs out, rather than constantly reinventing aims and
processes for the sake of political vanity.
“As the window of opportunity to avert dangerous climate change narrows, we urgently need to take stock of policy initiatives around the world that aim to accelerate new
energy technologies and stem greenhouse gas emissions,” said Laura Diaz Anadon, Professor of Climate Change Policy at the University of Cambridge.
“If we don’t build on the lessons from previous policy successes and failures to understand what works and why, we risk wasting time and money in a way that we simply
can’t afford,” said Anadon, who authored the new paper with colleagues from the Harvard Kennedy School as well as the University of Minnesota’s Prof Gabriel Chan.
Public investments in energy research have risen since the lows of the mid-1990s and early 2000s. OECD members spent US$16.6 billion on new energy research and
development (R&D) in 2016 compared to $10b in 2010. The EU and other nations pledged to double clean energy investment as part of 2015’s Paris Climate Change
Agreement.
Recently, the UK government set out its own Clean Growth Strategy, committing £2.5 billion between 2015 and 2021, with hundreds of millions to be invested in new
generations of small nuclear power stations and offshore wind turbines.
However, Anadon and colleagues point out that government funding for energy innovation has, in many cases, been highly volatile in the recent past: with political shifts
resulting in huge budget fluctuations and process reinventions in the UK and US.
For example, the research team found that every single year between 1990 and 2017, one in five technology areas funded by the US Department of Energy (DoE) saw a
budget shift of more than 30% up or down. The Trump administration’s current plan is to slash 2018’s energy R&D budget by 35% across the board.
“Experimentation has benefits, but also costs,” said Anadon. “Researchers are having to relearn new processes, people and programmes with every political transition --
wasting time and effort for scientists, companies and policymakers.”
“Rather than repeated overhauls, existing programs should be continuously evaluated and updated. New programs should only be set up if they fill needs not currently
met.”
More autonomy for project selection should be passed to active scientists, who are “best placed to spot bold but risky opportunities that managers miss,” say the authors of
the new paper.
They point to projects instigated by the US National Labs producing more commercially-viable technologies than those dictated by DoE headquarters — despite the Labs
holding a mere 4% of the DoE’s overall budget.
The six evidence-based guiding principles for clean energy investment are:
Give researchers and technical experts more autonomy and influence over funding decisions.
Build technology transfer into research organisations.
Focus demonstration projects on learning.
Incentivise international collaboration.
Adopt an adaptive learning strategy.
Keep funding stable and predictable.
From US researchers using the pace of Chinese construction markets to test energy reduction technologies, to the UK government harnessing behavioural psychology to
promote energy efficiency, the authors highlight examples of government investment that helped create or improve clean energy initiatives across the world.
“Let’s learn from experience on how to accelerate the transition to a cleaner, safer and more affordable energy system,” they write.
Available at: <http://www.sciencedaily.com/releases/2017/12/171206132223.htm>. Retrieved on: 28 Dec 2017. Adapted.
www.tecconcursos.com.br/questoes/694024
Governments need to give technical experts more autonomy and hold their nerve to provide more long-term stability when investing in clean energy, argue researchers in
climate change and innovation policy in a new paper published today.
Writing in the journal Nature, the authors from UK and US institutions have set out guidelines for investment based on an analysis of the last twenty years of “what works”
in clean energy research and innovation programs.
Their six simple “guiding principles” also include the need to channel innovation into the private sector through formal tech transfer programs, and to think in terms of
lasting knowledge creation rather than ‘quick win’ potential when funding new projects.
The authors offer a stark warning to governments and policymakers: learn from and build on experience before time runs out, rather than constantly reinventing aims and
processes for the sake of political vanity.
“As the window of opportunity to avert dangerous climate change narrows, we urgently need to take stock of policy initiatives around the world that aim to accelerate new
energy technologies and stem greenhouse gas emissions,” said Laura Diaz Anadon, Professor of Climate Change Policy at the University of Cambridge.
“If we don’t build on the lessons from previous policy successes and failures to understand what works and why, we risk wasting time and money in a way that we simply
can’t afford,” said Anadon, who authored the new paper with colleagues from the Harvard Kennedy School as well as the University of Minnesota’s Prof Gabriel Chan.
Public investments in energy research have risen since the lows of the mid-1990s and early 2000s. OECD members spent US$16.6 billion on new energy research and
development (R&D) in 2016 compared to $10b in 2010. The EU and other nations pledged to double clean energy investment as part of 2015’s Paris Climate Change
Agreement.
Recently, the UK government set out its own Clean Growth Strategy, committing £2.5 billion between 2015 and 2021, with hundreds of millions to be invested in new
generations of small nuclear power stations and offshore wind turbines.
However, Anadon and colleagues point out that government funding for energy innovation has, in many cases, been highly volatile in the recent past: with political shifts
resulting in huge budget fluctuations and process reinventions in the UK and US.
For example, the research team found that every single year between 1990 and 2017, one in five technology areas funded by the US Department of Energy (DoE) saw a
budget shift of more than 30% up or down. The Trump administration’s current plan is to slash 2018’s energy R&D budget by 35% across the board.
“Experimentation has benefits, but also costs,” said Anadon. “Researchers are having to relearn new processes, people and programmes with every political transition --
wasting time and effort for scientists, companies and policymakers.”
“Rather than repeated overhauls, existing programs should be continuously evaluated and updated. New programs should only be set up if they fill needs not currently
met.”
More autonomy for project selection should be passed to active scientists, who are “best placed to spot bold but risky opportunities that managers miss,” say the authors of
the new paper.
They point to projects instigated by the US National Labs producing more commercially-viable technologies than those dictated by DoE headquarters — despite the Labs
holding a mere 4% of the DoE’s overall budget.
The six evidence-based guiding principles for clean energy investment are:
Give researchers and technical experts more autonomy and influence over funding decisions.
Build technology transfer into research organisations.
Focus demonstration projects on learning.
Incentivise international collaboration.
Adopt an adaptive learning strategy.
Keep funding stable and predictable.
From US researchers using the pace of Chinese construction markets to test energy reduction technologies, to the UK government harnessing behavioural psychology to
promote energy efficiency, the authors highlight examples of government investment that helped create or improve clean energy initiatives across the world.
“Let’s learn from experience on how to accelerate the transition to a cleaner, safer and more affordable energy system,” they write.
Available at: <http://www.sciencedaily.com/releases/2017/12/171206132223.htm>. Retrieved on: 28 Dec 2017. Adapted.
www.tecconcursos.com.br/questoes/694026
Governments need to give technical experts more autonomy and hold their nerve to provide more long-term stability when investing in clean energy, argue researchers in
climate change and innovation policy in a new paper published today.
Writing in the journal Nature, the authors from UK and US institutions have set out guidelines for investment based on an analysis of the last twenty years of “what works”
in clean energy research and innovation programs.
Their six simple “guiding principles” also include the need to channel innovation into the private sector through formal tech transfer programs, and to think in terms of
lasting knowledge creation rather than ‘quick win’ potential when funding new projects.
The authors offer a stark warning to governments and policymakers: learn from and build on experience before time runs out, rather than constantly reinventing aims and
processes for the sake of political vanity.
“As the window of opportunity to avert dangerous climate change narrows, we urgently need to take stock of policy initiatives around the world that aim to accelerate new
energy technologies and stem greenhouse gas emissions,” said Laura Diaz Anadon, Professor of Climate Change Policy at the University of Cambridge.
“If we don’t build on the lessons from previous policy successes and failures to understand what works and why, we risk wasting time and money in a way that we simply
can’t afford,” said Anadon, who authored the new paper with colleagues from the Harvard Kennedy School as well as the University of Minnesota’s Prof Gabriel Chan.
Public investments in energy research have risen since the lows of the mid-1990s and early 2000s. OECD members spent US$16.6 billion on new energy research and
development (R&D) in 2016 compared to $10b in 2010. The EU and other nations pledged to double clean energy investment as part of 2015’s Paris Climate Change
Agreement.
Recently, the UK government set out its own Clean Growth Strategy, committing £2.5 billion between 2015 and 2021, with hundreds of millions to be invested in new
generations of small nuclear power stations and offshore wind turbines.
However, Anadon and colleagues point out that government funding for energy innovation has, in many cases, been highly volatile in the recent past: with political shifts
resulting in huge budget fluctuations and process reinventions in the UK and US.
For example, the research team found that every single year between 1990 and 2017, one in five technology areas funded by the US Department of Energy (DoE) saw a
budget shift of more than 30% up or down. The Trump administration’s current plan is to slash 2018’s energy R&D budget by 35% across the board.
“Experimentation has benefits, but also costs,” said Anadon. “Researchers are having to relearn new processes, people and programmes with every political transition --
wasting time and effort for scientists, companies and policymakers.”
“Rather than repeated overhauls, existing programs should be continuously evaluated and updated. New programs should only be set up if they fill needs not currently
met.”
More autonomy for project selection should be passed to active scientists, who are “best placed to spot bold but risky opportunities that managers miss,” say the authors of
the new paper.
They point to projects instigated by the US National Labs producing more commercially-viable technologies than those dictated by DoE headquarters — despite the Labs
holding a mere 4% of the DoE’s overall budget.
The six evidence-based guiding principles for clean energy investment are:
Give researchers and technical experts more autonomy and influence over funding decisions.
Build technology transfer into research organisations.
Focus demonstration projects on learning.
Incentivise international collaboration.
Adopt an adaptive learning strategy.
Keep funding stable and predictable.
From US researchers using the pace of Chinese construction markets to test energy reduction technologies, to the UK government harnessing behavioural psychology to
promote energy efficiency, the authors highlight examples of government investment that helped create or improve clean energy initiatives across the world.
“Let’s learn from experience on how to accelerate the transition to a cleaner, safer and more affordable energy system,” they write.
Available at: <http://www.sciencedaily.com/releases/2017/12/171206132223.htm>. Retrieved on: 28 Dec 2017. Adapted.
According to the text, one of the guiding principles for clean energy investment is
www.tecconcursos.com.br/questoes/694033
It’s no secret that the global energy demand continues to rise. Driven by emerging economies and non-OECD nations, total worldwide energy usage is expected to grow by
nearly 40% over the next 20 years. That’ll require a staggering amount of coal, oil and gas.
But it’s not just fossil fuels that will get the nod. The demand for renewable energy sources is exploding, and according to a new study, we haven’t seen anything yet in
terms of spending on solar, wind and other green energy projects. For investors, that spending could lead to some serious portfolio green as well.
The future is certainly looking pretty “green” for renewable energy bulls. A new study shows that the sector will receive nearly $5.1 trillion worth of investment in new
power plants by 2030. According to a new report by Bloomberg New Energy Finance, by 2030, renewable energy sources will account for over 60% of the 5,579 gigawatts
of new generation capacity and 65% of the $7.7 trillion in power investment. Overall, fossil fuels, such as coal and natural gas, will see their total share of power
generation fall to 46%. That’s a lot, but down from roughly 64% today.
Large-scale hydropower facilities will command the lion’s share of new capacity among green energy sources. However, the expansion by solar and wind energy will be
mighty swift as well.
The Bloomberg report shows that solar and wind will increase their combined share of global generation capacity to 16% from 3% by 2030. The key driver will be utility-
scale solar power plants, as well as the vast adoption of rooftop solar arrays in emerging markets lacking modern grid infrastructure. In places like Latin America and
India, the lack of infrastructure will actually make rooftop solar a cheaper option for electricity generation. Analysts estimate that Latin America will add nearly 102 GW
worth of rooftop solar arrays during the study’s time period.
Bloomberg New Energy predicts that economics will have more to do with the additional generation capacity than subsidies. The same can be said for many Asian nations.
Increased solar adoption will benefit from higher costs related to rising liquid natural gas (LNG) imports in the region starting in 2024. Likewise, on- and offshore wind
power facilities will see rising capacity as well.
In the developed world, Bloomberg New Energy Finance predicts that CO2 and emission reductions will also help play a major role in adding additional renewable energy
to the grid. While the U.S. will still focus much of its attention towards shale gas, developed Europe will spend roughly $67 billion on new green energy capacity by 2030.
www.tecconcursos.com.br/questoes/694035
It’s no secret that the global energy demand continues to rise. Driven by emerging economies and non-OECD nations, total worldwide energy usage is expected to grow by
nearly 40% over the next 20 years. That’ll require a staggering amount of coal, oil and gas.
But it’s not just fossil fuels that will get the nod. The demand for renewable energy sources is exploding, and according to a new study, we haven’t seen anything yet in
terms of spending on solar, wind and other green energy projects. For investors, that spending could lead to some serious portfolio green as well.
The future is certainly looking pretty “green” for renewable energy bulls. A new study shows that the sector will receive nearly $5.1 trillion worth of investment in new
power plants by 2030. According to a new report by Bloomberg New Energy Finance, by 2030, renewable energy sources will account for over 60% of the 5,579 gigawatts
of new generation capacity and 65% of the $7.7 trillion in power investment. Overall, fossil fuels, such as coal and natural gas, will see their total share of power
generation fall to 46%. That’s a lot, but down from roughly 64% today.
Large-scale hydropower facilities will command the lion’s share of new capacity among green energy sources. However, the expansion by solar and wind energy will be
mighty swift as well.
The Bloomberg report shows that solar and wind will increase their combined share of global generation capacity to 16% from 3% by 2030. The key driver will be utility-
scale solar power plants, as well as the vast adoption of rooftop solar arrays in emerging markets lacking modern grid infrastructure. In places like Latin America and
India, the lack of infrastructure will actually make rooftop solar a cheaper option for electricity generation. Analysts estimate that Latin America will add nearly 102 GW
worth of rooftop solar arrays during the study’s time period.
Bloomberg New Energy predicts that economics will have more to do with the additional generation capacity than subsidies. The same can be said for many Asian nations.
Increased solar adoption will benefit from higher costs related to rising liquefied natural gas (LNG) imports in the region starting in 2024. Likewise, on- and offshore wind
power facilities will see rising capacity as well.
In the developed world, Bloomberg New Energy Finance predicts that CO2 and emission reductions will also help play a major role in adding additional renewable energy
to the grid. While the U.S. will still focus much of its attention towards shale gas, developed Europe will spend roughly $67 billion on new green energy capacity by 2030.
In the text, the author affirms that “The future is certainly looking pretty green for renewable energy bulls” because of the
a) large share of electricity to be generated from renewable energy sources by 2030.
b) expected growth in fossil fuels in the total share of power generation by 2030.
c) dominant position of coal and natural gas for electricity generation nowadays.
d) global boom in hydropower generation by the end of this decade.
e) massive investment in solar and wind energy in the next decade.
www.tecconcursos.com.br/questoes/694036
Clean energy: Experts outline how governments can successfully invest before it’s too late
Governments need to give technical experts more autonomy and hold their nerve to provide more long-term stability when investing in clean energy, argue researchers in
climate change and innovation policy in a new paper published today.
Writing in the journal Nature, the authors from UK and US institutions have set out guidelines for investment based on an analysis of the last twenty years of “what works”
in clean energy research and innovation programs.
Their six simple “guiding principles” also include the need to channel innovation into the private sector through formal tech transfer programs, and to think in terms of
lasting knowledge creation rather than ‘quick win’ potential when funding new projects.
The authors offer a stark warning to governments and policymakers: learn from and build on experience before time runs out, rather than constantly reinventing aims and
processes for the sake of political vanity. “As the window of opportunity to avert dangerous climate change narrows, we urgently need to take stock of policy initiatives
around the world that aim to accelerate new energy technologies and stem greenhouse gas emissions,” said Laura Diaz Anadon, Professor of Climate Change Policy at the
University of Cambridge.
“If we don’t build on the lessons from previous policy successes and failures to understand what works and why, we risk wasting time and money in a way that we simply
can’t afford,” said Anadon, who authored the new paper with colleagues from the Harvard Kennedy School as well as the University of Minnesota’s Prof Gabriel Chan.
Public investments in energy research have risen since the lows of the mid-1990s and early 2000s. OECD members spent US$16.6 billion on new energy research and
development (R&D) in 2016 compared to $10b in 2010. The EU and other nations pledged to double clean energy investment as part of 2015’s Paris Climate Change
Agreement.
Recently, the UK government set out its own Clean Growth Strategy, committing £2.5 billion between 2015 and 2021, with hundreds of millions to be invested in new
generations of small nuclear power stations and offshore wind turbines. However, Anadon and colleagues point out that government funding for energy innovation has, in
many cases, been highly volatile in the recent past: with political shifts resulting in huge budget fluctuations and process reinventions in the UK and US.
For example, the research team found that every single year between 1990 and 2017, one in five technology areas funded by the US Department of Energy (DoE) saw a
budget shift of more than 30% up or down. The Trump administration’s current plan is to slash 2018’s energy R&D budget by 35% across the board.
“Experimentation has benefits, but also costs,” said Anadon. “Researchers are having to relearn new processes, people and programmes with every political transition --
wasting time and effort for scientists, companies and policymakers.”
“Rather than repeated overhauls, existing programs should be continuously evaluated and updated. New programs should only be set up if they fill needs not currently
met.”
More autonomy for project selection should be passed to active scientists, who are “best placed to spot bold but risky opportunities that managers miss,” say the authors of
the new paper.
They point to projects instigated by the US National Labs producing more commercially-viable technologies than those dictated by DoE headquarters — despite the Labs
holding a mere 4% of the DoE’s overall budget.
The six evidence-based guiding principles for clean energy investment are:
• Give researchers and technical experts more autonomy and influence over funding decisions.
• Build technology transfer into research organisations.
• Focus demonstration projects on learning.
• Incentivise international collaboration.
• Adopt an adaptive learning strategy.
• Keep funding stable and predictable.
From US researchers using the pace of Chinese construction markets to test energy reduction technologies, to the UK government harnessing behavioural psychology to
promote energy efficiency, the authors highlight examples of government investment that helped create or improve clean energy initiatives across the world.
“Let’s learn from experience on how to accelerate the transition to a cleaner, safer and more affordable energy system,” they write.
It’s no secret that the global energy demand continues to rise. Driven by emerging economies and non-OECD nations, total worldwide energy usage is expected to grow by
nearly 40% over the next 20 years. That’ll require a staggering amount of coal, oil and gas.
But it’s not just fossil fuels that will get the nod. The demand for renewable energy sources is exploding, and according to a new study, we haven’t seen anything yet in
terms of spending on solar, wind and other green energy projects. For investors, that spending could lead to some serious portfolio green as well.
The future is certainly looking pretty “green” for renewable energy bulls. A new study shows that the sector will receive nearly $5.1 trillion worth of investment in new
power plants by 2030. According to a new report by Bloomberg New Energy Finance, by 2030, renewable energy sources will account for over 60% of the 5,579 gigawatts
of new generation capacity and 65% of the $7.7 trillion in power investment. Overall, fossil fuels, such as coal and natural gas, will see their total share of power
generation fall to 46%. That’s a lot, but down from roughly 64% today.
Large-scale hydropower facilities will command the lion’s share of new capacity among green energy sources. However, the expansion by solar and wind energy will be
mighty swift as well.
The Bloomberg report shows that solar and wind will increase their combined share of global generation capacity to 16% from 3% by 2030. The key driver will be utility-
scale solar power plants, as well as the vast adoption of rooftop solar arrays in emerging markets lacking modern grid infrastructure. In places like Latin America and
India, the lack of infrastructure will actually make rooftop solar a cheaper option for electricity generation. Analysts estimate that Latin America will add nearly 102 GW
worth of rooftop solar arrays during the study’s time period.
Bloomberg New Energy predicts that economics will have more to do with the additional generation capacity than subsidies. The same can be said for many Asian nations.
Increased solar adoption will benefit from higher costs related to rising liquefied natural gas (LNG) imports in the region starting in 2024. Likewise, on- and offshore wind
power facilities will see rising capacity as well.
In the developed world, Bloomberg New Energy Finance predicts that CO2 and emission reductions will also help play a major role in adding additional renewable energy
to the grid. While the U.S. will still focus much of its attention towards shale gas, developed Europe will spend roughly $67 billion on new green energy capacity by 2030.
www.tecconcursos.com.br/questoes/565178
Oil
Overview
The oil industry has a less-than-stellar environmental record in general, but it becomes even worse in tropical rainforest regions, which often contain rich deposits of
petroleum. The most notorious examples of rainforest havoc caused by oil firms are Shell Oil in Nigeria and Texaco in Ecuador. The operations run by both companies
degraded the environment and affected local and indigenous people by their activities. The Texaco operation in Ecuador was responsible for spilling some 17 million gallons
of oil into the biologically rich tributaries of the upper Amazon, while in the 1980s and 1990s Shell Oil cooperated with the oppressive military dictatorship in Nigeria in the
suppression and harassment of local people.
Action
The simplest and most reliable way to mitigate damage from oil operations would be to prohibit oil extraction in the tropical rainforest. But that is unlikely given the
number of tropical countries that produce oil and the wealth of oil deposits located in forest areas. Thus the focus is on reducing pollution and avoiding spills through better
pipeline management, reinjection techniques, and halting methane flaring. Limiting road development and restricting access can help avoid deforestation associated with
settlement.
Biofuels
The energy and technology sectors are investing heavily in alternatives to conventional fossil fuels, but early efforts to use crop-based biofuels have had serious
environmental consequences.
While some believed biofuels—fuels that are derived from biomass, including recently living organisms like plants or their metabolic byproducts like cow manure— would
offer environmental benefits over conventional fossil fuels, the production and use of biofuels derived from palm oil, soy, corn, rapeseed, and sugar cane have in recent years driven up food prices, promoted large-scale deforestation, depleted water supplies, worsened soil erosion, and led to increased air and water pollution. Still, there is hope that the next generation of biofuels, derived from farm waste, algae, and native grasses and weeds, could eliminate many of the worst effects seen during the
current rush into biofuels.
Efficiency
Good old-fashioned oil conservation is effective in reducing demand for oil products. After the first OPEC embargo in 1973, the United States realized the importance of oil
efficiency and initiated policies to do away with wasteful practices. By 1985, the U.S. was 25 percent more energy efficient and 32 percent more oil efficient than in 1973.
Of course the U.S. was upstaged by the Japanese who in the same period improved their energy efficiency by 31 percent and their oil efficiency by 51 percent. Today the
importance of oil to the economy is still diminishing. Despite the 51 percent growth in the American economy between 1990 and 2004, carbon emissions only increased
19% suggesting that those who insist that economic growth and carbon dioxide emissions move in tandem are wrong.
The developed world can seek alternative methods to oil exploration, by developing new technologies that rely less on processes that are ecologically damaging. For
example, compressed natural gas is a cleaner-burning fuel than gasoline, is already used in some cars, and is available in vast quantities. Electric cars are potentially even
more environmentally sound.
To encourage investment in research and development of “greener” technologies, governments can help by eliminating subsidies for the oil and gas industry and imposing
higher taxes on heavy polluters. While governments will play a role in cleaner-energy development, it is likely that the private sector will provide most of the funding and
innovation for new energy projects. Venture capital firms and corporations have put billions into new technologies since the mid-2000s, while corporations are getting on
board as well.
As experiences with biofuels have shown, there are often downsides to alternative energy sources. For example, hydroelectric projects have destroyed river systems and
flooded vast areas of forests. Thus when undertaking any large-scale energy project — whether it’s wind, solar, tidal, geothermal, or something else — it is important to
conduct a proper assessment of its impact.
Conclusion
Admittedly, there are many challenges facing sustainable use of tropical rainforests. In arriving at a solution many issues must be addressed, including the resolution of
conflicting claims to land considered to be in the public domain; barriers to markets; the assurance of sustainable development without overexploitation in the face of growing demand for forest products; determination of the best way to use forests; and the consideration of many other factors.
Almost none of these economic possibilities can become realities if the rainforests are completely stripped. Useful products cannot be harvested from species that no
longer exist, just as eco-tourists will not visit the vast stretches of wasteland that were once lush forest. Thus some of the primary rainforests must be salvaged for
sustainable development to be at all successful.
www.tecconcursos.com.br/questoes/565189
Oil
Overview
The oil industry has a less-than-stellar environmental record in general, but it becomes even worse in tropical rainforest regions, which often contain rich deposits of
petroleum. The most notorious examples of rainforest havoc caused by oil firms are Shell Oil in Nigeria and Texaco in Ecuador. The operations run by both companies
degraded the environment and affected local and indigenous people by their activities. The Texaco operation in Ecuador was responsible for spilling some 17 million gallons
of oil into the biologically rich tributaries of the upper Amazon, while in the 1980s and 1990s Shell Oil cooperated with the oppressive military dictatorship in Nigeria in the
suppression and harassment of local people.
Action
The simplest and most reliable way to mitigate damage from oil operations would be to prohibit oil extraction in the tropical rainforest. But that is unlikely given the
number of tropical countries that produce oil and the wealth of oil deposits located in forest areas. Thus the focus is on reducing pollution and avoiding spills through better
pipeline management, reinjection techniques, and halting methane flaring. Limiting road development and restricting access can help avoid deforestation associated with
settlement.
Biofuels
The energy and technology sectors are investing heavily in alternatives to conventional fossil fuels, but early efforts to use crop-based biofuels have had serious
environmental consequences.
While some believed biofuels—fuels that are derived from biomass, including recently living organisms like plants or their metabolic byproducts like cow manure— would
offer environmental benefits over conventional fossil fuels, the production and use of biofuels derived from palm oil, soy, corn, rapeseed, and sugar cane have in recent years driven up food prices, promoted large-scale deforestation, depleted water supplies, worsened soil erosion, and led to increased air and water pollution. Still, there is hope that the next generation of biofuels, derived from farm waste, algae, and native grasses and weeds, could eliminate many of the worst effects seen during the
current rush into biofuels.
Efficiency
Good old-fashioned oil conservation is effective in reducing demand for oil products. After the first OPEC embargo in 1973, the United States realized the importance of oil
efficiency and initiated policies to do away with wasteful practices. By 1985, the U.S. was 25 percent more energy efficient and 32 percent more oil efficient than in 1973.
Of course the U.S. was upstaged by the Japanese who in the same period improved their energy efficiency by 31 percent and their oil efficiency by 51 percent. Today the
importance of oil to the economy is still diminishing. Despite the 51 percent growth in the American economy between 1990 and 2004, carbon emissions only increased
19% suggesting that those who insist that economic growth and carbon dioxide emissions move in tandem are wrong.
The developed world can seek alternative methods to oil exploration, by developing new technologies that rely less on processes that are ecologically damaging. For
example, compressed natural gas is a cleaner-burning fuel than gasoline, is already used in some cars, and is available in vast quantities. Electric cars are potentially even
more environmentally sound.
To encourage investment in research and development of “greener” technologies, governments can help by eliminating subsidies for the oil and gas industry and imposing
higher taxes on heavy polluters. While governments will play a role in cleaner-energy development, it is likely that the private sector will provide most of the funding and
innovation for new energy projects. Venture capital firms and corporations have put billions into new technologies since the mid-2000s, while corporations are getting on
board as well.
As experiences with biofuels have shown, there are often downsides to alternative energy sources. For example, hydroelectric projects have destroyed river systems and
flooded vast areas of forests. Thus when undertaking any large-scale energy project — whether it’s wind, solar, tidal, geothermal, or something else — it is important to
conduct a proper assessment of its impact.
Conclusion
Admittedly, there are many challenges facing sustainable use of tropical rainforests. In arriving at a solution many issues must be addressed, including the resolution of
conflicting claims to land considered to be in the public domain; barriers to markets; the assurance of sustainable development without overexploitation in the face of
growing demand for forest products; determination of the best way to use forests; and the consideration of many other factors.
Almost none of these economic possibilities can become realities if the rainforests are completely stripped. Useful products cannot be harvested from species that no
longer exist, just as eco-tourists will not visit the vast stretches of wasteland that were once lush forest. Thus some of the primary rainforests must be salvaged for
sustainable development to be at all successful.
a) Japan became more energy efficient than the USA in the 1973-1985 period.
b) Both the USA and Japan became equally energy efficient in the 1973-1985 period.
c) The USA became more energy efficient than Japan in the 1973-1985 period.
d) The USA became the most energy efficient country in the world in the 1973-1985 period.
e) The USA did not become more energy efficient during the 1973-1985 period.
www.tecconcursos.com.br/questoes/565191
Oil
Overview
The oil industry has a less-than-stellar environmental record in general, but it becomes even worse in tropical rainforest regions, which often contain rich deposits of
petroleum. The most notorious examples of rainforest havoc caused by oil firms are Shell Oil in Nigeria and Texaco in Ecuador. The operations run by both companies
degraded the environment and affected local and indigenous people by their activities. The Texaco operation in Ecuador was responsible for spilling some 17 million gallons
of oil into the biologically rich tributaries of the upper Amazon, while in the 1980s and 1990s Shell Oil cooperated with the oppressive military dictatorship in Nigeria in the
suppression and harassment of local people.
Action
The simplest and most reliable way to mitigate damage from oil operations would be to prohibit oil extraction in the tropical rainforest. But that is unlikely given the
number of tropical countries that produce oil and the wealth of oil deposits located in forest areas. Thus the focus is on reducing pollution and avoiding spills through better
pipeline management, reinjection techniques, and halting methane flaring. Limiting road development and restricting access can help avoid deforestation associated with
settlement.
Biofuels
The energy and technology sectors are investing heavily in alternatives to conventional fossil fuels, but early efforts to use crop-based biofuels have had serious
environmental consequences.
While some believed biofuels—fuels that are derived from biomass, including recently living organisms like plants or their metabolic byproducts like cow manure— would
offer environmental benefits over conventional fossil fuels, the production and use of biofuels derived from palm oil, soy, corn, rapeseed, and sugar cane have in recent years driven up food prices, promoted large-scale deforestation, depleted water supplies, worsened soil erosion, and led to increased air and water pollution. Still, there is hope that the next generation of biofuels, derived from farm waste, algae, and native grasses and weeds, could eliminate many of the worst effects seen during the
current rush into biofuels.
Efficiency
Good old-fashioned oil conservation is effective in reducing demand for oil products. After the first OPEC embargo in 1973, the United States realized the importance of oil
efficiency and initiated policies to do away with wasteful practices. By 1985, the U.S. was 25 percent more energy efficient and 32 percent more oil efficient than in 1973.
Of course the U.S. was upstaged by the Japanese who in the same period improved their energy efficiency by 31 percent and their oil efficiency by 51 percent. Today the
importance of oil to the economy is still diminishing. Despite the 51 percent growth in the American economy between 1990 and 2004, carbon emissions only increased
19% suggesting that those who insist that economic growth and carbon dioxide emissions move in tandem are wrong.
The developed world can seek alternative methods to oil exploration, by developing new technologies that rely less on processes that are ecologically damaging. For
example, compressed natural gas is a cleaner-burning fuel than gasoline, is already used in some cars, and is available in vast quantities. Electric cars are potentially even
more environmentally sound.
To encourage investment in research and development of “greener” technologies, governments can help by eliminating subsidies for the oil and gas industry and imposing
higher taxes on heavy polluters. While governments will play a role in cleaner-energy development, it is likely that the private sector will provide most of the funding and
innovation for new energy projects. Venture capital firms and corporations have put billions into new technologies since the mid-2000s, while corporations are getting on
board as well.
As experiences with biofuels have shown, there are often downsides to alternative energy sources. For example, hydroelectric projects have destroyed river systems and
flooded vast areas of forests. Thus when undertaking any large-scale energy project — whether it’s wind, solar, tidal, geothermal, or something else — it is important to
conduct a proper assessment of its impact.
Conclusion
Admittedly, there are many challenges facing sustainable use of tropical rainforests. In arriving at a solution many issues must be addressed, including the resolution of
conflicting claims to land considered to be in the public domain; barriers to markets; the assurance of sustainable development without overexploitation in the face of
growing demand for forest products; determination of the best way to use forests; and the consideration of many other factors.
Almost none of these economic possibilities can become realities if the rainforests are completely stripped. Useful products cannot be harvested from species that no
longer exist, just as eco-tourists will not visit the vast stretches of wasteland that were once lush forest. Thus some of the primary rainforests must be salvaged for
sustainable development to be at all successful.
The fragment of the text “Despite the 51 percent growth in the American economy between 1990 and 2004, carbon emissions only increased 19% suggesting that those who
insist that economic growth and carbon dioxide emissions move in tandem are wrong” implies that
a) both percent rates are incorrect.
b) those increase rates are independent.
c) the relationship between those rates must be established.
d) the American economy grew 51% from 1990 to 2004, and as a consequence, carbon emissions decreased.
e) the American economy grew 51% from 1990 to 2004 because carbon emission only increased 19%.
www.tecconcursos.com.br/questoes/565194
The refineries in the US processed 230,293 barrels of Amazon crude oil a day in 2015, according to the study conducted by the Amazon Watch, an environmental group.
The demand for crude oil in the US is forcing countries such as Peru, Colombia and Ecuador to expand their oil drilling operations and hence, driving the destruction of the
rainforest ecosystem.
The Amazon rainforest has already become highly vulnerable to forest fires due to selective logging, hunting, altering or fragmenting the landscape and other forms of
habitat degradation. As oil interests lead to aggressive extraction of oil, there are fears that indigenous communities will suffer displacement and acquire deadly illnesses
due to lack of acquired immunity.
Mining and construction of hydroelectric dams, in what is considered the lungs of the world, have already triggered deforestation. The problem is getting even more
confounded when ambitions of large oil firms, including some from China, have come to play their part. According to the study, the proposed oil and gas fields now cover
283,172 sq. miles of the Amazon.
Explaining the “triple carbon impact” of Amazonian oil extraction, the report says that copious amount of carbon is emitted when the rainforest is cut down to establish drill
sites. Further destruction of the world’s largest carbon sink takes place when necessary roads and other infrastructure are constructed. More carbon emissions are
experienced when the oil is ultimately burned for energy.
Ecuador
Ecuador’s state-run oil firm PetroAmazonas has started drilling close to the Yasuni National Park— one of the most biologically rich places in the world. It is home to 655
endemic tree species—more than the US and Canada combined—and two of the last tribes in the world. While extraction of 23,000 barrels of oil a day began in
September, critics raise concern over a possibility of oil destroying Yasuní the way it caused widespread deforestation and pollution in the rest of Ecuador and the western
Amazon. Apart from the risk of water and soil contamination, the construction of roads deep into the forest would lead to hunting and deforestation. The environmentalists
also fear an inevitable conflict for land between oil workers and the semi-nomadic tribes of Waorani Indians living within the park for generations.
Peru
The viability of oil production in Peruvian Amazon has been questioned time and again. In the past two months, the protest by Peruvian Amazonian indigenous communities
against oil pollution on their lands has grown stronger. Till October, nine cases of pipeline spill have been reported.
The indigenous communities, who witnessed the latest pipeline spill on October 23, are now demanding a national debate on whether oil drilling should continue in the
Peruvian Amazon that plays a crucial role in regulating global climate and hydrological patterns.
The protesters want a state of emergency to be declared in two districts of the lower Marañón Valley where five indigenous communities have been affected by a series of
oil spills.
While there has been an overall decline in US crude imports, the imports from the Amazon are on the rise. The US is now importing more crude oil from the Amazon than
from any single foreign country. “Our demand for Amazon crude is driving the expansion of the Amazon oil frontier and is putting millions of acres of indigenous territory
and pristine rainforest on the chopping block,” said Leila Salazar-López, executive director of Amazon Watch.
According to Adam Zuckerman, Amazon Watch’s End Amazon Crude campaign manager, “All commercial and public fleets in California—and many across the US—that buy
bulk diesel are using fuel that is at least partially derived from Amazon crude.”
The report also revealed that California refines an average of 170,978 barrels of Amazon crude a day with the Chevron facility in El Segundo refining 24 per cent of the US
alone. After crude is refined, it is distributed as diesel to vehicle fleets across the US.
Interestingly, California Environment Protection Agency, in 2015, made an unequivocal climate commitment, “In order to meet federal health-based air quality standards
and our climate change goals, we must cut in half the amount of petroleum we use in our cars and trucks over the next 15 years.”
From the fragment of the text “Explaining the ‘triple carbon impact’ of Amazonian oil extraction, the report says that copious amount of carbon is emitted when the rainforest
is cut down to establish drill sites”, the word report refers to
a) this article.
b) the person who wrote the article.
c) Chinese oil firms’ business plan.
d) study produced by an environmental group.
e) written piece issued by hydroelectric power plants.
www.tecconcursos.com.br/questoes/565195
The refineries in the US processed 230,293 barrels of Amazon crude oil a day in 2015, according to the study conducted by the Amazon Watch, an environmental group.
The demand for crude oil in the US is forcing countries such as Peru, Colombia and Ecuador to expand their oil drilling operations and hence, driving the destruction of the
rainforest ecosystem.
The Amazon rainforest has already become highly vulnerable to forest fires due to selective logging, hunting, altering or fragmenting the landscape and other forms of
habitat degradation. As oil interests lead to aggressive extraction of oil, there are fears that indigenous communities will suffer displacement and acquire deadly illnesses
due to lack of acquired immunity.
Mining and construction of hydroelectric dams, in what is considered the lungs of the world, have already triggered deforestation. The problem is getting even more
confounded when ambitions of large oil firms, including some from China, have come to play their part. According to the study, the proposed oil and gas fields now cover
283,172 sq. miles of the Amazon.
Explaining the “triple carbon impact” of Amazonian oil extraction, the report says that copious amount of carbon is emitted when the rainforest is cut down to establish drill
sites. Further destruction of the world’s largest carbon sink takes place when necessary roads and other infrastructure are constructed. More carbon emissions are
experienced when the oil is ultimately burned for energy.
Ecuador
Ecuador’s state-run oil firm PetroAmazonas has started drilling close to the Yasuni National Park— one of the most biologically rich places in the world. It is home to 655
endemic tree species—more than the US and Canada combined—and two of the last tribes in the world. While extraction of 23,000 barrels of oil a day began in
September, critics raise concern over a possibility of oil destroying Yasuní the way it caused widespread deforestation and pollution in the rest of Ecuador and the western
Amazon. Apart from the risk of water and soil contamination, the construction of roads deep into the forest would lead to hunting and deforestation. The environmentalists
also fear an inevitable conflict for land between oil workers and the semi-nomadic tribes of Waorani Indians living within the park for generations.
Peru
The viability of oil production in Peruvian Amazon has been questioned time and again. In the past two months, the protest by Peruvian Amazonian indigenous communities
against oil pollution on their lands has grown stronger. Till October, nine cases of pipeline spill have been reported.
The indigenous communities, who witnessed the latest pipeline spill on October 23, are now demanding a national debate on whether oil drilling should continue in the
Peruvian Amazon that plays a crucial role in regulating global climate and hydrological patterns.
The protesters want a state of emergency to be declared in two districts of the lower Marañón Valley where five indigenous communities have been affected by a series of
oil spills.
While there has been an overall decline in US crude imports, the imports from the Amazon are on the rise. The US is now importing more crude oil from the Amazon than
from any single foreign country. “Our demand for Amazon crude is driving the expansion of the Amazon oil frontier and is putting millions of acres of indigenous territory
and pristine rainforest on the chopping block,” said Leila Salazar-López, executive director of Amazon Watch.
According to Adam Zuckerman, Amazon Watch’s End Amazon Crude campaign manager, “All commercial and public fleets in California—and many across the US—that buy
bulk diesel are using fuel that is at least partially derived from Amazon crude.”
The report also revealed that California refines an average of 170,978 barrels of Amazon crude a day with the Chevron facility in El Segundo refining 24 per cent of the US
alone. After crude is refined, it is distributed as diesel to vehicle fleets across the US.
Interestingly, California Environment Protection Agency, in 2015, made an unequivocal climate commitment, “In order to meet federal health-based air quality standards
and our climate change goals, we must cut in half the amount of petroleum we use in our cars and trucks over the next 15 years.”
From the 9th paragraph of the text, one can conclude that, in recent overall figures, the US import of crude oil
a) has soared.
b) has ceased.
c) has kept its previous levels.
d) has spread its fuel source areas.
e) has concentrated its fuel source areas.
www.tecconcursos.com.br/questoes/348295
CESGRANRIO - TA (ANP)/ANP/2016
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
88) Low Oil Prices Could Be Good
for Electricity and Renewables
By Robert Fares
Since I first wrote about the price of oil last December, the global oil price has fallen to levels not seen in over five years. For many, the recent price decline brings back
memories of the 1980s oil price collapse, which followed the 70s oil price spike and drew attention away from renewable energy and other alternatives — famously
prompting U.S. President Ronald Reagan to remove the White House solar panels that had been installed by the previous administration.
Thankfully, this time around, the outlook for renewable energy isn’t so bleak. In fact, it is possible low oil prices could actually improve the economics of renewable energy.
It all comes down to the relationship between oil and gas production and the price of electricity, which directly affects the bottom line of technologies like wind and solar.
In 1973, the year the Arab Oil Embargo caused a steep rise in oil prices, the United States produced 17 percent of its electricity using petroleum. When the oil price
increased, the price of electricity increased too. This increase in price prompted greater interest in domestic sources of electricity, like coal, nuclear, and renewable energy.
Due in part to the turn away from oil in the 70s, today the United States produces just 0.7 percent of its electricity using petroleum. Therefore, the price of oil has no direct
impact on the price of electricity. Most electricity comes from coal (39 percent) and natural gas (27 percent), with the remainder coming from nuclear, hydroelectric, wind,
and other renewables. The fuel with the most direct impact on the price of electricity is natural gas, because natural gas generation often sets the price of electricity in the
market. To gauge how low oil prices might affect the price of electricity, it’s really important to think about how they might affect the price of natural gas.
Although oil and natural gas prices have decoupled in recent years, there is still an indirect link between the price of oil and the price of natural gas, because both oil and
natural gas are often produced from the same well. While most U.S. natural gas is produced from wells drilled for the express purpose of extracting gas, a portion comes from wells that are drilled to extract oil, but produce natural gas as a byproduct. This “associated gas” or “casinghead gas” is often flared in regions like the Bakken in North Dakota, which has limited pipeline infrastructure. However, in regions like Texas’s Eagle Ford and Permian Basin, this gas is often injected into the existing pipeline network. Because drillers are really after the more-valuable oil, associated natural gas is often simply dumped into the pipelines at little or no cost — depressing the overall price of natural gas. The Railroad Commission of Texas, which regulates the oil and gas industry, collects separate data on natural gas produced from gas wells and natural gas produced as a byproduct from oil wells.
These data show that, while overall Texas natural gas production has increased since 2008, the amount of gas produced from purpose-drilled gas wells has actually
declined. On the other hand, natural gas associated with oil production has increased markedly since 2008.
b) introduce the idea that the low prices of oil can be positive for electricity and renewables.
c) defend the position of those who see no connection between the prices of oil and the electric market.
d) discuss the position of the Reagan government in relation to oil prices in the 80s.
e) attack those who believe that the prices of oil should increase.
www.tecconcursos.com.br/questoes/348307
CESGRANRIO - TA (ANP)/ANP/2016
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
89) Low Oil Prices Could Be Good
for Electricity and Renewables
By Robert Fares
Since I first wrote about the price of oil last December, the global oil price has fallen to levels not seen in over five years. For many, the recent price decline brings back
memories of the 1980s oil price collapse, which followed the 70s oil price spike and drew attention away from renewable energy and other alternatives — famously
prompting U.S. President Ronald Reagan to remove the White House solar panels that had been installed by the previous administration.
Thankfully, this time around, the outlook for renewable energy isn’t so bleak. In fact, it is possible low oil prices could actually improve the economics of renewable energy.
It all comes down to the relationship between oil and gas production and the price of electricity, which directly affects the bottom line of technologies like wind and solar.
In 1973, the year the Arab Oil Embargo caused a steep rise in oil prices, the United States produced 17 percent of its electricity using petroleum. When the oil price
increased, the price of electricity increased too. This increase in price prompted greater interest in domestic sources of electricity, like coal, nuclear, and renewable energy.
Due in part to the turn away from oil in the 70s, today the United States produces just 0.7 percent of its electricity using petroleum. Therefore, the price of oil has no direct
impact on the price of electricity. Most electricity comes from coal (39 percent) and natural gas (27 percent), with the remainder coming from nuclear, hydroelectric, wind,
and other renewables. The fuel with the most direct impact on the price of electricity is natural gas, because natural gas generation often sets the price of electricity in the
market. To gauge how low oil prices might affect the price of electricity, it’s really important to think about how they might affect the price of natural gas.
Although oil and natural gas prices have decoupled in recent years, there is still an indirect link between the price of oil and the price of natural gas, because both oil and
natural gas are often produced from the same well. While most U.S. natural gas is produced from wells drilled for the express purpose of extracting gas, a portion comes
from wells that are drilled to extract oil, but produce natural gas as a byproduct. This “associated gas” or “casinghead gas” is often flared in regions like the Bakken in
North Dakota, which has limited pipeline infrastructure. However, in regions like Texas’s Eagle Ford and Permian Basin, this gas is often injected into the existing pipeline
network. Because drillers are really after the more-valuable oil, associated natural gas is often simply dumped into the pipelines at little or no cost — depressing the
overall price of natural gas. The Railroad Commission of Texas, which regulates the oil and gas industry, collects separate data on natural gas produced from gas wells and
natural gas produced as a byproduct from oil wells.
These data show that, while overall Texas natural gas production has increased since 2008, the amount of gas produced from purpose-drilled gas wells has actually
declined. On the other hand, natural gas associated with oil production has increased markedly since 2008.
From the fragment of the text “Although oil and natural gas prices have decoupled in recent years, there is still an indirect link between the price of oil and the price of
natural gas, because both oil and natural gas are often produced from the same well”, it can be inferred that
a) oil and natural gas are seldom extracted from the same wells.
b) oil and natural gas produced from the same well have their prices often determined by government decisions.
c) oil and natural gas extracted from the same wells bring as an effect an indirect link between their prices.
d) oil and natural gas prices have been increasingly independent in recent years because they are often produced from the same well.
e) oil and natural gas prices have been increasingly dependent in recent years because they are often produced from the same well.
www.tecconcursos.com.br/questoes/348512
President Obama announced Friday morning that he has denied TransCanada’s permit application to build the Keystone XL oil pipeline in the U.S.
“The State Department has decided that the Keystone XL pipeline would not serve the national interest of the United States,” Obama said. “I agree with that decision.”
Obama said America is a global leader on taking action on climate change, and approving Keystone XL would have undercut that leadership. Some crude oil needs to be left in the ground to keep the climate from warming further, and rejecting Keystone XL will help meet that goal, he said.
Among the reasons for rejecting Keystone XL, Obama said the pipeline would not make a meaningful long-term contribution to the U.S. economy, nor would it increase
U.S. energy security or help to lower gas prices, which have already declined dramatically over the last year.
TransCanada said in a statement that it “would review all of its options in light of a permit denial for Keystone XL,” including the possibility of filing a new permit application
for a pipeline.
“TransCanada and its shippers remain absolutely committed to building this important energy infrastructure project,” TransCanada CEO Russ Girling said in a statement.
State Department officials said at a news conference Friday that TransCanada is free to apply for a new permit to build a cross-border pipeline and it is up to the company
to do so.
The $8 billion Keystone XL pipeline was slated to stretch 1,179 miles from east-central Alberta, Canada, to the Texas Gulf Coast. It would transport 830,000 barrels of
crude oil per day from the Canadian tar sands to refineries near Houston. Proposed in 2008, the 875-mile section between the Canadian border and Steele City, Neb.,
needed State Department approval because it crossed an international border.
Other parts of TransCanada’s Keystone Project between central Nebraska and Texas have already been built and are carrying tar sands oil to refineries along the Gulf
Coast today. Environmental advocates have rallied against the unbuilt portion and urged the Obama administration to reject it, saying emissions from the production and
burning of tar sands oil it would carry could worsen climate change.
The U.S. Environmental Protection Agency calculated that the tar sands oil the pipeline would carry is highly damaging to the climate, emitting about 1.3 billion more tons
of greenhouse emissions over the pipeline’s 50-year lifespan than if it were carrying conventional crude oil. The production of tar sands oil releases 17 percent more CO2
into the atmosphere than the average barrel of crude oil produced elsewhere, according to the State Department.
“Construction of the Keystone XL pipeline would be inconsistent with stabilizing global warming below dangerous levels,” Penn State University climate scientist Michael
Mann said. “I am pleased that the administration has made good on their promise to take seriously the task of acting on climate by rejecting the construction of the
pipeline.”
Available at: <http://www.scientificamerican.com/article/obamarejects-keystone-xl-pipeline/>. Retrieved on: Nov. 10th, 2015. Adapted.
c) present the reasons why the American government is against the construction of Keystone XL Pipeline through the American territory.
www.tecconcursos.com.br/questoes/348516
President Obama announced Friday morning that he has denied TransCanada’s permit application to build the Keystone XL oil pipeline in the U.S.
“The State Department has decided that the Keystone XL pipeline would not serve the national interest of the United States,” Obama said. “I agree with that decision.”
Obama said America is a global leader on taking action on climate change, and approving Keystone XL would have undercut that leadership. Some crude oil needs to be
left in the ground to keep the climate from warming further, and rejecting Keystone XL will help meet that goal, he said.
Among the reasons for rejecting Keystone XL, Obama said the pipeline would not make a meaningful long-term contribution to the U.S. economy, nor would it increase
U.S. energy security or help to lower gas prices, which have already declined dramatically over the last year.
TransCanada said in a statement that it “would review all of its options in light of a permit denial for Keystone XL,” including the possibility of filing a new permit application
for a pipeline.
“TransCanada and its shippers remain absolutely committed to building this important energy infrastructure project,” TransCanada CEO Russ Girling said in a statement.
State Department officials said at a news conference Friday that TransCanada is free to apply for a new permit to build a cross-border pipeline and it is up to the company
to do so.
The $8 billion Keystone XL pipeline was slated to stretch 1,179 miles from east-central Alberta, Canada, to the Texas Gulf Coast. It would transport 830,000 barrels of
crude oil per day from the Canadian tar sands to refineries near Houston. Proposed in 2008, the 875-mile section between the Canadian border and Steele City, Neb.,
needed State Department approval because it crossed an international border.
Other parts of TransCanada’s Keystone Project between central Nebraska and Texas have already been built and are carrying tar sands oil to refineries along the Gulf
Coast today. Environmental advocates have rallied against the unbuilt portion and urged the Obama administration to reject it, saying emissions from the production and
burning of tar sands oil it would carry could worsen climate change.
The U.S. Environmental Protection Agency calculated that the tar sands oil the pipeline would carry is highly damaging to the climate, emitting about 1.3 billion more tons
of greenhouse emissions over the pipeline’s 50-year lifespan than if it were carrying conventional crude oil. The production of tar sands oil releases 17 percent more CO2
into the atmosphere than the average barrel of crude oil produced elsewhere, according to the State Department.
“Construction of the Keystone XL pipeline would be inconsistent with stabilizing global warming below dangerous levels,” Penn State University climate scientist Michael
Mann said. “I am pleased that the administration has made good on their promise to take seriously the task of acting on climate by rejecting the construction of the
pipeline.”
Available at: <http://www.scientificamerican.com/article/obamarejects-keystone-xl-pipeline/>. Retrieved on: Nov. 10th, 2015. Adapted.
b) the United States is trying to achieve leadership on taking action on climate change.
c) according to the American government, the construction of the pipeline causes mild impact on the climate.
d) the American government sees no relation between the construction of the Keystone XL and climate change.
e) the approval of the Keystone XL would contradict American concerns with climate change.
www.tecconcursos.com.br/questoes/348522
President Obama announced Friday morning that he has denied TransCanada’s permit application to build the Keystone XL oil pipeline in the U.S.
“The State Department has decided that the Keystone XL pipeline would not serve the national interest of the United States,” Obama said. “I agree with that decision.”
Obama said America is a global leader on taking action on climate change, and approving Keystone XL would have undercut that leadership. Some crude oil needs to be
left in the ground to keep the climate from warming further, and rejecting Keystone XL will help meet that goal, he said.
Among the reasons for rejecting Keystone XL, Obama said the pipeline would not make a meaningful long-term contribution to the U.S. economy, nor would it increase
U.S. energy security or help to lower gas prices, which have already declined dramatically over the last year.
TransCanada said in a statement that it “would review all of its options in light of a permit denial for Keystone XL,” including the possibility of filing a new permit application
for a pipeline.
“TransCanada and its shippers remain absolutely committed to building this important energy infrastructure project,” TransCanada CEO Russ Girling said in a statement.
State Department officials said at a news conference Friday that TransCanada is free to apply for a new permit to build a cross-border pipeline and it is up to the company
to do so.
The $8 billion Keystone XL pipeline was slated to stretch 1,179 miles from east-central Alberta, Canada, to the Texas Gulf Coast. It would transport 830,000 barrels of
crude oil per day from the Canadian tar sands to refineries near Houston. Proposed in 2008, the 875-mile section between the Canadian border and Steele City, Neb.,
needed State Department approval because it crossed an international border.
Other parts of TransCanada’s Keystone Project between central Nebraska and Texas have already been built and are carrying tar sands oil to refineries along the Gulf
Coast today. Environmental advocates have rallied against the unbuilt portion and urged the Obama administration to reject it, saying emissions from the production and
burning of tar sands oil it would carry could worsen climate change.
The U.S. Environmental Protection Agency calculated that the tar sands oil the pipeline would carry is highly damaging to the climate, emitting about 1.3 billion more tons
of greenhouse emissions over the pipeline’s 50-year lifespan than if it were carrying conventional crude oil. The production of tar sands oil releases 17 percent more CO2
into the atmosphere than the average barrel of crude oil produced elsewhere, according to the State Department.
“Construction of the Keystone XL pipeline would be inconsistent with stabilizing global warming below dangerous levels,” Penn State University climate scientist Michael
Mann said. “I am pleased that the administration has made good on their promise to take seriously the task of acting on climate by rejecting the construction of the
pipeline.”
Available at: <http://www.scientificamerican.com/article/obamarejects-keystone-xl-pipeline/>. Retrieved on: Nov. 10th, 2015. Adapted.
After reading the 10th paragraph of the text, one can infer that
a) the pipeline would release 1.3 billion tons of greenhouse emissions in 50 years if it carried tar sands oil.
b) the pipeline would release 1.3 billion tons of greenhouse emissions in 50 years if it carried conventional crude oil.
c) the pipeline would release the same volume of greenhouse emissions in 50 years no matter what kind of oil it carried.
d) greenhouse emissions would be increased in about 1.3 billion tons in 50 years if the pipeline carried tar sands oil.
e) greenhouse emissions would be increased in about 1.3 billion tons in 50 years if the pipeline carried conventional crude oil.
www.tecconcursos.com.br/questoes/386544
The world is obviously not a place where features such as resources, people and economic activities are randomly distributed; there is a logic, or an order, to spatial
distribution. Geography seeks to understand the spatial order of things as well as their interactions, particularly when the spatial order is less evident. Transportation is one
element of this spatial order as it is at the same time influenced by geography as well as having an influence on it. For instance, the path followed by a road is influenced
by regional economic and physical attributes, but once constructed the same road will shape future regional developments.
Transportation is of relevance to geography for two main reasons. First, transport infrastructures, terminals, modes and networks occupy an important place in space and
constitute the basis of a complex spatial system. Second, since geography seeks to explain spatial relationships, transport networks are of specific interest because they
are the main physical support of these interactions.
Transport geography, as a discipline, emerged as a branch of economic geography in the second half of the twentieth century. In earlier considerations, particularly in commercial geography (late 19th and early 20th century), transportation was an important factor behind the economic representations of the geographic space, namely in terms of the location of economic activities and the monetary costs of distance. These cost considerations became the foundation of several geographical theories such as central places and location analysis. The growing mobility of passengers and freight justified the emergence of transport geography as a specialized field of investigation.
In the 1960s, transport had to be formalized as a key factor in location theories and transport geography began to rely increasingly on quantitative methods, particularly
over network and spatial interactions analysis. However, from the 1970s, technical, political and economic changes challenged the centrality of transportation in many
geographical and regional development investigations. The strong spatial anchoring effect of high transportation costs receded and decentralization was a dominant
paradigm that was observed within cities (suburbanization), but also within regions. The spatial theory foundations of transport geography, particularly the friction of
distance, became less relevant, or less evident, in explaining socioeconomic processes. As a result, transportation became underrepresented in economic geography in the
1970s and 1980s, even if the mobility of people and freight and low transport costs were considered as important factors behind the globalization of trade and production.
Since the 1990s, transport geography has received renewed attention with new realms of investigation. The issues of mobility, production and distribution became
interrelated in a complex geographical setting where the local, regional and global became increasingly blurred through the development of new passenger and freight
transport systems (Hoyle and Knowles, 1998). For instance, suburbanization resulted in an array of challenges related to congestion and automobile dependency. Rapid
urbanization in developing economies underlined the challenges of transport infrastructure investment for private as well as collective uses. Globalization supported the
development of complex air and maritime transportation networks, many of which supporting global supply chains and trade relations across long distances. The role of
information and communication technologies was also being felt, often as a support or as an alternative to mobility. All of the above were linked with new and expanded
mobilities of passengers, freight and information.
www.tecconcursos.com.br/questoes/386545
The world is obviously not a place where features such as resources, people and economic activities are randomly distributed; there is a logic, or an order, to spatial
distribution. Geography seeks to understand the spatial order of things as well as their interactions, particularly when the spatial order is less evident. Transportation is one
element of this spatial order as it is at the same time influenced by geography as well as having an influence on it. For instance, the path followed by a road is influenced
by regional economic and physical attributes, but once constructed the same road will shape future regional developments.
Transportation is of relevance to geography for two main reasons. First, transport infrastructures, terminals, modes and networks occupy an important place in space and
constitute the basis of a complex spatial system. Second, since geography seeks to explain spatial relationships, transport networks are of specific interest because they
are the main physical support of these interactions.
Transport geography, as a discipline, emerged as a branch of economic geography in the second half of the twentieth century. In earlier considerations, particularly in
commercial geography (late 19th and early 20th century), transportation was an important factor behind the economic representations of the geographic space, namely in
terms of the location of economic activities and the monetary costs of distance. These cost considerations became the foundation of several geographical theories such as
central places and location analysis. The growing mobility of passengers and freight justified the emergence of transport geography as a specialized field of investigation.
In the 1960s, transport had to be formalized as a key factor in location theories and transport geography began to rely increasingly on quantitative methods, particularly
over network and spatial interactions analysis. However, from the 1970s, technical, political and economic changes challenged the centrality of transportation in many
geographical and regional development investigations. The strong spatial anchoring effect of high transportation costs receded and decentralization was a dominant
paradigm that was observed within cities (suburbanization), but also within regions. The spatial theory foundations of transport geography, particularly the friction of
distance, became less relevant, or less evident, in explaining socioeconomic processes. As a result, transportation became underrepresented in economic geography in the
1970s and 1980s, even if the mobility of people and freight and low transport costs were considered as important factors behind the globalization of trade and production.
Since the 1990s, transport geography has received renewed attention with new realms of investigation. The issues of mobility, production and distribution became
interrelated in a complex geographical setting where the local, regional and global became increasingly blurred through the development of new passengers and freight
transport systems (Hoyle and Knowles, 1998). For instance, suburbanization resulted in an array of challenges related to congestion and automobile dependency. Rapid
urbanization in developing economies underlined the challenges of transport infrastructure investment for private as well as collective uses. Globalization supported the
development of complex air and maritime transportation networks, many of which supporting global supply chains and trade relations across long distances. The role of
information and communication technologies was also being felt, often as a support or as an alternative to mobility. All of the above were linked with new and expanded
mobilities of passengers, freight and information.
The text points out two main reasons why transportation is of relevance to geography.
www.tecconcursos.com.br/questoes/386547
Transportation is of relevance to geography for two main reasons. First, transport infrastructures, terminals, modes and networks occupy an important place in space and
constitute the basis of a complex spatial system. Second, since geography seeks to explain spatial relationships, transport networks are of specific interest because they
are the main physical support of these interactions.
Transport geography, as a discipline, emerged as a branch of economic geography in the second half of the twentieth century. In earlier considerations, particularly in
commercial geography (late 19th and early 20th century), transportation was an important factor behind the economic representations of the geographic space, namely in
terms of the location of economic activities and the monetary costs of distance. These cost considerations became the foundation of several geographical theories such as
central places and location analysis. The growing mobility of passengers and freight justified the emergence of transport geography as a specialized field of investigation.
In the 1960s, transport had to be formalized as a key factor in location theories and transport geography began to rely increasingly on quantitative methods, particularly
over network and spatial interactions analysis. However, from the 1970s, technical, political and economic changes challenged the centrality of transportation in many
geographical and regional development investigations. The strong spatial anchoring effect of high transportation costs receded and decentralization was a dominant
paradigm that was observed within cities (suburbanization), but also within regions. The spatial theory foundations of transport geography, particularly the friction of
distance, became less relevant, or less evident, in explaining socioeconomic processes. As a result, transportation became underrepresented in economic geography in the
1970s and 1980s, even if the mobility of people and freight and low transport costs were considered as important factors behind the globalization of trade and production.
Since the 1990s, transport geography has received renewed attention with new realms of investigation. The issues of mobility, production and distribution became
interrelated in a complex geographical setting where the local, regional and global became increasingly blurred through the development of new passengers and freight
transport systems (Hoyle and Knowles, 1998). For instance, suburbanization resulted in an array of challenges related to congestion and automobile dependency. Rapid
urbanization in developing economies underlined the challenges of transport infrastructure investment for private as well as collective uses. Globalization supported the
development of complex air and maritime transportation networks, many of which supporting global supply chains and trade relations across long distances. The role of
information and communication technologies was also being felt, often as a support or as an alternative to mobility. All of the above were linked with new and expanded
mobilities of passengers, freight and information.
According to the text, the emergence of transport geography as a specialized field of investigation is justified by the
a) growing mobility of passengers and freight.
b) idea that the world is not a place where such features are randomly distributed.
c) fact that geography seeks to understand the spatial order of things.
d) fact that cost considerations became the foundation of several geographical theories.
e) fact that transportation was an important issue behind the economic representations of the geographic space.
www.tecconcursos.com.br/questoes/386550
The world is obviously not a place where features such as resources, people and economic activities are randomly distributed; there is a logic, or an order, to spatial
distribution. Geography seeks to understand the spatial order of things as well as their interactions, particularly when the spatial order is less evident. Transportation is one
element of this spatial order as it is at the same time influenced by geography as well as having an influence on it. For instance, the path followed by a road is influenced
by regional economic and physical attributes, but once constructed the same road will shape future regional developments.
Transportation is of relevance to geography for two main reasons. First, transport infrastructures, terminals, modes and networks occupy an important place in space and
constitute the basis of a complex spatial system. Second, since geography seeks to explain spatial relationships, transport networks are of specific interest because they
are the main physical support of these interactions.
Transport geography, as a discipline, emerged as a branch of economic geography in the second half of the twentieth century. In earlier considerations, particularly in
commercial geography (late 19th and early 20th century), transportation was an important factor behind the economic representations of the geographic space, namely in
terms of the location of economic activities and the monetary costs of distance. These cost considerations became the foundation of several geographical theories such as
central places and location analysis. The growing mobility of passengers and freight justified the emergence of transport geography as a specialized field of investigation.
In the 1960s, transport had to be formalized as a key factor in location theories and transport geography began to rely increasingly on quantitative methods, particularly
over network and spatial interactions analysis. However, from the 1970s, technical, political and economic changes challenged the centrality of transportation in many
geographical and regional development investigations. The strong spatial anchoring effect of high transportation costs receded and decentralization was a dominant
paradigm that was observed within cities (suburbanization), but also within regions. The spatial theory foundations of transport geography, particularly the friction of
distance, became less relevant, or less evident, in explaining socioeconomic processes. As a result, transportation became underrepresented in economic geography in the
1970s and 1980s, even if the mobility of people and freight and low transport costs were considered as important factors behind the globalization of trade and production.
Since the 1990s, transport geography has received renewed attention with new realms of investigation. The issues of mobility, production and distribution became
interrelated in a complex geographical setting where the local, regional and global became increasingly blurred through the development of new passengers and freight
transport systems (Hoyle and Knowles, 1998). For instance, suburbanization resulted in an array of challenges related to congestion and automobile dependency. Rapid
urbanization in developing economies underlined the challenges of transport infrastructure investment for private as well as collective uses. Globalization supported the
development of complex air and maritime transportation networks, many of which supporting global supply chains and trade relations across long distances. The role of
information and communication technologies was also being felt, often as a support or as an alternative to mobility. All of the above were linked with new and expanded
mobilities of passengers, freight and information.
From the fragment of the text “However, from the 1970s, technical, political and economic changes challenged the centrality of transportation in many geographical and
regional development investigations. The strong spatial anchoring effect of high transportation costs receded and decentralization was a dominant paradigm that was
observed within cities (suburbanization), but also within regions.”, it can be inferred that
a) suburbanization emerged because the spatial anchoring effect of transportation costs increased.
b) transportation maintained its centrality because of technical, political and economic changes in the 1970s.
c) decentralization became the prevailing model in the urban and regional development in the 1970s.
d) the technical, political and economic changes in the 1970s resulted in a transportation crisis.
e) transportation costs had a negative effect in the urban and regional development in the 1970s.
www.tecconcursos.com.br/questoes/386554
The world is obviously not a place where features such as resources, people and economic activities are randomly distributed; there is a logic, or an order, to spatial
distribution. Geography seeks to understand the spatial order of things as well as their interactions, particularly when the spatial order is less evident. Transportation is one
element of this spatial order as it is at the same time influenced by geography as well as having an influence on it. For instance, the path followed by a road is influenced
by regional economic and physical attributes, but once constructed the same road will shape future regional developments.
Transportation is of relevance to geography for two main reasons. First, transport infrastructures, terminals, modes and networks occupy an important place in space and
constitute the basis of a complex spatial system. Second, since geography seeks to explain spatial relationships, transport networks are of specific interest because they
are the main physical support of these interactions.
Transport geography, as a discipline, emerged as a branch of economic geography in the second half of the twentieth century. In earlier considerations, particularly in
commercial geography (late 19th and early 20th century), transportation was an important factor behind the economic representations of the geographic space, namely in
terms of the location of economic activities and the monetary costs of distance. These cost considerations became the foundation of several geographical theories such as
central places and location analysis. The growing mobility of passengers and freight justified the emergence of transport geography as a specialized field of investigation.
In the 1960s, transport had to be formalized as a key factor in location theories and transport geography began to rely increasingly on quantitative methods, particularly
over network and spatial interactions analysis. However, from the 1970s, technical, political and economic changes challenged the centrality of transportation in many
geographical and regional development investigations. The strong spatial anchoring effect of high transportation costs receded and decentralization was a dominant
paradigm that was observed within cities (suburbanization), but also within regions. The spatial theory foundations of transport geography, particularly the friction of
distance, became less relevant, or less evident, in explaining socioeconomic processes. As a result, transportation became underrepresented in economic geography in the
1970s and 1980s, even if the mobility of people and freight and low transport costs were considered as important factors behind the globalization of trade and production.
Since the 1990s, transport geography has received renewed attention with new realms of investigation. The issues of mobility, production and distribution became
interrelated in a complex geographical setting where the local, regional and global became increasingly blurred through the development of new passengers and freight
transport systems (Hoyle and Knowles, 1998). For instance, suburbanization resulted in an array of challenges related to congestion and automobile dependency. Rapid
urbanization in developing economies underlined the challenges of transport infrastructure investment for private as well as collective uses. Globalization supported the
development of complex air and maritime transportation networks, many of which supporting global supply chains and trade relations across long distances. The role of
information and communication technologies was also being felt, often as a support or as an alternative to mobility. All of the above were linked with new and expanded
mobilities of passengers, freight and information.
From the sentence in the text “Since the 1990s, transport geography has received renewed attention with new realms of investigation”, it can be concluded that transport
geography
a) received new realms of investigation at the end of the 1990s.
b) was only studied with new realms of investigation in the 1990s.
c) was only studied with new realms of investigation before the 1990s.
d) was only studied with new realms of investigation at the beginning of the 1990s.
e) started being studied with new realms of investigation at the beginning of the 1990s that are still being applied to its study nowadays.
www.tecconcursos.com.br/questoes/465852
More so than many other fields of business, the maritime industry is focused on cost, which in turn gives the appearance of being conservative towards technology.
Certainly, we have technical ships magnificently operating with equipment that wouldn’t look out of place in a NASA lab, but generally, it can take decades for a technology
to become mainstream. Unless it becomes mandated by the IMO (International Maritime Organization). Vessel tracking is a partial exception to the rule though, with many
fleet owners realizing its potential for more cost-effective operation and personnel security.
Knowing the exact position of all vessels in a fleet, in a software solution designed to fit with your own logistical processes, can significantly improve efficiency. If a ship
arrives early or late, more often than not there will be an associated cost. If this can be identified during transit then the early or late arrival can be negated or at least
planned for. Likewise, if knowing the positions of your fleet of workboats means that you can route the closest vessel to the next job, then significant fuel cost savings
can be made. With modern tracking systems, the way data is used is just as important as knowing where a vessel is at all times. But there are countless ways to apply the
data to the benefit of efficiency for a single ship or fleet. So providing easy and reliable access to position reports is essential.
RockFLEET is an advanced new tracking unit for the professional maritime environment. During its design phase, the team decided that in order for the position data it
provides to be of the most use, as well as being available via Rock Seven’s own fleet viewer ‘The Core,’ it must also be available in any software system the user chooses.
Using a standards-based API (Application Programming Interface), the customer can integrate tracking data from RockFLEET into their own applications. Typically this
means that RockFLEET-tracked assets can be added to existing fleet management software, which invariably is designed around an owner’s or operator’s own logistics.
With precise vessel location data available, the opportunities are unlimited and only down to the creativity of the user. For instance, a current Rock Seven customer uses
location data to manage payroll of personnel. Essentially, personnel get paid different amounts depending on whether the ship is at sea, in international waters, in port or
transiting regions with high piracy incidents.
The above user is a private security company involved in anti-piracy operations. It actually gets location data using RockSTAR, the handheld version of RockFLEET, which is
a new fixed unit that can be fitted anywhere on board. Completely waterproof and with no moving parts, it is a robust, ultra-compact (13cm diameter/4cm high) device
with multiple mounting options. The physical design of RockFLEET was in part driven by the security challenges faced by vessels facing the issues of modern piracy.
Knowing the location of all friendly vessels in a region is vital to organisations with a stake in ensuring safe passage through known piracy hotspots. With an operational
vessel/fleet tracking system, ship owners and fleet managers will know where their ships are at all times. This information can be fed to authorities, private anti-piracy
companies and the naval forces patrolling piracy hotspots to build a clear, near real-time picture for domain awareness. The value of this information should a vessel be
hijacked is obvious: knowing the last whereabouts of a vessel provides responders with a starting point should a hijacked vessel’s tracking system be disabled by pirates.
Today’s pirates know that many commercial vessels are tracked, especially those would-be targets sailing in what are known to be hostile waters. So disabling vessel
tracking equipment on board is a sensible action for said pirates after a hijacked ship’s crew have been subdued, and because most tracking units are powered by the
vessel, finding and cutting the power supply isn’t hard. RockFLEET, however, is the only device of its kind with an internal battery backup, so it can continue to transmit
position for up to two weeks if external power is cut.
With facility to mount covertly, this makes it especially suitable for vessels traversing piracy hotspots.
www.tecconcursos.com.br/questoes/465853
More so than many other fields of business, the maritime industry is focused on cost, which in turn gives the appearance of being conservative towards technology.
Certainly, we have technical ships magnificently operating with equipment that wouldn’t look out of place in a NASA lab, but generally, it can take decades for a technology
to become mainstream. Unless it becomes mandated by the IMO (International Maritime Organization). Vessel tracking is a partial exception to the rule though, with many
fleet owners realizing its potential for more cost-effective operation and personnel security.
Knowing the exact position of all vessels in a fleet, in a software solution designed to fit with your own logistical processes, can significantly improve efficiency. If a ship
arrives early or late, more often than not there will be an associated cost. If this can be identified during transit then the early or late arrival can be negated or at least
planned for. Likewise, if knowing the positions of your fleet of workboats means that you can route the closest vessel to the next job, then significant fuel cost savings
can be made. With modern tracking systems, the way data is used is just as important as knowing where a vessel is at all times. But there are countless ways to apply the
data to the benefit of efficiency for a single ship or fleet. So providing easy and reliable access to position reports is essential.
RockFLEET is an advanced new tracking unit for the professional maritime environment. During its design phase, the team decided that in order for the position data it
provides to be of the most use, as well as being available via Rock Seven’s own fleet viewer ‘The Core,’ it must also be available in any software system the user chooses.
Using a standards-based API (Application Programming Interface), the customer can integrate tracking data from RockFLEET into their own applications. Typically this
means that RockFLEET-tracked assets can be added to existing fleet management software, which invariably is designed around an owner’s or operator’s own logistics.
With precise vessel location data available, the opportunities are unlimited and only down to the creativity of the user. For instance, a current Rock Seven customer uses
location data to manage payroll of personnel. Essentially, personnel get paid different amounts depending on whether the ship is at sea, in international waters, in port or
transiting regions with high piracy incidents.
The above user is a private security company involved in anti-piracy operations. It actually gets location data using RockSTAR, the handheld version of RockFLEET, which is
a new fixed unit that can be fitted anywhere on board. Completely waterproof and with no moving parts, it is a robust, ultra-compact (13cm diameter/4cm high) device
with multiple mounting options. The physical design of RockFLEET was in part driven by the security challenges faced by vessels facing the issues of modern piracy.
The unit itself is designed to look anonymous; as standard there’s no name on the outside. It works from ship’s power, but it uniquely has a backup battery inside. Which is
important should a vessel be hijacked and the main power cut.
Knowing the location of all friendly vessels in a region is vital to organisations with a stake in ensuring safe passage through known piracy hotspots. With an operational
vessel/fleet tracking system, ship owners and fleet managers will know where their ships are at all times. This information can be fed to authorities, private anti-piracy
companies and the naval forces patrolling piracy hotspots to build a clear, near real-time picture for domain awareness. The value of this information should a vessel be
hijacked is obvious: knowing the last whereabouts of a vessel provides responders with a starting point should a hijacked vessel’s tracking system be disabled by pirates.
Today’s pirates know that many commercial vessels are tracked, especially those would-be targets sailing in what are known to be hostile waters. So disabling vessel
tracking equipment on board is a sensible action for said pirates after a hijacked ship’s crew have been subdued, and because most tracking units are powered by the
vessel, finding and cutting the power supply isn’t hard. RockFLEET, however, is the only device of its kind with an internal battery backup, so it can continue to transmit
position for up to two weeks if external power is cut.
With facility to mount covertly, this makes it especially suitable for vessels traversing piracy hotspots.
www.tecconcursos.com.br/questoes/465854
More so than many other fields of business, the maritime industry is focused on cost, which in turn gives the appearance of being conservative towards technology.
Certainly, we have technical ships magnificently operating with equipment that wouldn’t look out of place in a NASA lab, but generally, it can take decades for a technology
to become mainstream. Unless it becomes mandated by the IMO (International Maritime Organization). Vessel tracking is a partial exception to the rule though, with many
fleet owners realizing its potential for more cost-effective operation and personnel security.
Knowing the exact position of all vessels in a fleet, in a software solution designed to fit with your own logistical processes, can significantly improve efficiency. If a ship
arrives early or late, more often than not there will be an associated cost. If this can be identified during transit then the early or late arrival can be negated or at least
planned for. Likewise, if knowing the positions of your fleet of workboats means that you can route the closest vessel to the next job, then significant fuel cost savings
can be made. With modern tracking systems, the way data is used is just as important as knowing where a vessel is at all times. But there are countless ways to apply the
data to the benefit of efficiency for a single ship or fleet. So providing easy and reliable access to position reports is essential.
RockFLEET is an advanced new tracking unit for the professional maritime environment. During its design phase, the team decided that in order for the position data it
provides to be of the most use, as well as being available via Rock Seven’s own fleet viewer ‘The Core,’ it must also be available in any software system the user chooses.
Using a standards-based API (Application Programming Interface), the customer can integrate tracking data from RockFLEET into their own applications. Typically this
means that RockFLEET-tracked assets can be added to existing fleet management software, which invariably is designed around an owner’s or operator’s own logistics.
With precise vessel location data available, the opportunities are unlimited and only down to the creativity of the user. For instance, a current Rock Seven customer uses
location data to manage payroll of personnel. Essentially, personnel get paid different amounts depending on whether the ship is at sea, in international waters, in port or
transiting regions with high piracy incidents.
The above user is a private security company involved in anti-piracy operations. It actually gets location data using RockSTAR, the handheld version of RockFLEET, which is
a new fixed unit that can be fitted anywhere on board. Completely waterproof and with no moving parts, it is a robust, ultra-compact (13cm diameter/4cm high) device
with multiple mounting options. The physical design of RockFLEET was in part driven by the security challenges faced by vessels facing the issues of modern piracy.
The unit itself is designed to look anonymous; as standard there’s no name on the outside. It works from ship’s power, but it uniquely has a backup battery inside. Which is
important should a vessel be hijacked and the main power cut.
Knowing the location of all friendly vessels in a region is vital to organisations with a stake in ensuring safe passage through known piracy hotspots. With an operational
vessel/fleet tracking system, ship owners and fleet managers will know where their ships are at all times. This information can be fed to authorities, private anti-piracy
companies and the naval forces patrolling piracy hotspots to build a clear, near real-time picture for domain awareness. The value of this information should a vessel be
hijacked is obvious: knowing the last whereabouts of a vessel provides responders with a starting point should a hijacked vessel’s tracking system be disabled by pirates.
Today’s pirates know that many commercial vessels are tracked, especially those would-be targets sailing in what are known to be hostile waters. So disabling vessel
tracking equipment on board is a sensible action for said pirates after a hijacked ship’s crew have been subdued, and because most tracking units are powered by the
vessel, finding and cutting the power supply isn’t hard. RockFLEET, however, is the only device of its kind with an internal battery backup, so it can continue to transmit
position for up to two weeks if external power is cut.
With facility to mount covertly, this makes it especially suitable for vessels traversing piracy hotspots.
The fragment in the text “we have technical ships magnificently operating with equipment that wouldn’t look out of place in a NASA lab” means that some of the equipment
used on technical ships
a) should be used in a NASA lab.
b) cannot be found in a NASA lab.
c) seem to be appropriate for a NASA lab.
d) would not look suitable for a NASA lab.
e) lack the resources found in equipment in a NASA lab.
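The passage above describes two technical ideas only in general terms: exposing vessel position reports through a standards-based API so they can be pulled into an operator’s own fleet software, and routing the closest workboat to the next job to cut fuel costs. The short Python sketch below is illustrative only; the endpoint URL, the JSON field names and the nearest-vessel helper are assumptions made for this example, not Rock Seven’s actual API.

import json
import math
import urllib.request

API_URL = "https://example.com/fleet/positions"  # hypothetical endpoint, not Rock Seven's real service


def fetch_positions(url=API_URL):
    # Download the latest position report for every tracked vessel (assumed to be a JSON list).
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())


def distance_nm(lat1, lon1, lat2, lon2):
    # Great-circle distance in nautical miles (haversine formula).
    rlat1, rlat2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(rlat1) * math.cos(rlat2) * math.sin(dlon / 2) ** 2
    return 2 * 3440.065 * math.asin(math.sqrt(a))  # mean Earth radius is roughly 3,440 nautical miles


def closest_vessel(vessels, job_lat, job_lon):
    # Pick the vessel nearest to the next job, as the text suggests, to save fuel.
    return min(vessels, key=lambda v: distance_nm(v["lat"], v["lon"], job_lat, job_lon))


if __name__ == "__main__":
    fleet = [  # stand-in data; a live integration would call fetch_positions() instead
        {"name": "Workboat A", "lat": 51.50, "lon": -3.00},
        {"name": "Workboat B", "lat": 50.90, "lon": -1.40},
    ]
    print(closest_vessel(fleet, 50.80, -1.10)["name"])

Run as is, the script prints “Workboat B” from the stand-in data; a real deployment would replace the hard-coded list with the output of fetch_positions().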
www.tecconcursos.com.br/questoes/465875
More so than many other fields of business, the maritime industry is focused on cost, which in turn gives the appearance of being conservative towards technology.
Certainly, we have technical ships magnificently operating with equipment that wouldn’t look out of place in a NASA lab, but generally, it can take decades for a technology
to become mainstream. Unless it becomes mandated by the IMO (International Maritime Organization). Vessel tracking is a partial exception to the rule though, with many
fleet owners realizing its potential for more cost-effective operation and personnel security.
Knowing the exact position of all vessels in a fleet, in a software solution designed to fit with your own logistical processes, can significantly improve efficiency. If a ship
arrives early or late, more often than not there will be an associated cost. If this can be identified during transit then the early or late arrival can be negated or at least
planned for. Likewise, if knowing the positions of your fleet of workboats means that you can route the closest vessel to the next job, then significant fuel cost savings
can be made. With modern tracking systems, the way data is used is just as important as knowing where a vessel is at all times. But there are countless ways to apply the
data to the benefit of efficiency for a single ship or fleet. So providing easy and reliable access to position reports is essential.
RockFLEET is an advanced new tracking unit for the professional maritime environment. During its design phase, the team decided that in order for the position data it
provides to be of the most use, as well as being available via Rock Seven’s own fleet viewer ‘The Core,’ it must also be available in any software system the user chooses.
With precise vessel location data available, the opportunities are unlimited and only down to the creativity of the user. For instance, a current Rock Seven customer uses
location data to manage payroll of personnel. Essentially, personnel get paid different amounts depending on whether the ship is at sea, in international waters, in port or
transiting regions with high piracy incidents.
The above user is a private security company involved in anti-piracy operations. It actually gets location data using RockSTAR, the handheld version of RockFLEET, which is
a new fixed unit that can be fitted anywhere on board. Completely waterproof and with no moving parts, it is a robust, ultra-compact (13cm diameter/4cm high) device
with multiple mounting options. The physical design of RockFLEET was in part driven by the security challenges faced by vessels facing the issues of modern piracy.
The unit itself is designed to look anonymous; as standard there’s no name on the outside. It works from ship’s power, but it uniquely has a backup battery inside. Which is
important should a vessel be hijacked and the main power cut.
Knowing the location of all friendly vessels in a region is vital to organisations with a stake in ensuring safe passage through known piracy hotspots. With an operational
vessel/fleet tracking system, ship owners and fleet managers will know where their ships are at all times. This information can be fed to authorities, private anti-piracy
companies and the naval forces patrolling piracy hotspots to build a clear, near real-time picture for domain awareness. The value of this information should a vessel be
hijacked is obvious: knowing the last whereabouts of a vessel provides responders with a starting point should a hijacked vessel’s tracking system be disabled by pirates.
Today’s pirates know that many commercial vessels are tracked, especially those would-be targets sailing in what are known to be hostile waters. So disabling vessel
tracking equipment on board is a sensible action for said pirates after a hijacked ship’s crew have been subdued, and because most tracking units are powered by the
vessel, finding and cutting the power supply isn’t hard. RockFLEET, however, is the only device of its kind with an internal battery backup, so it can continue to transmit
position for up to two weeks if external power is cut.
With facility to mount covertly, this makes it especially suitable for vessels traversing piracy hotspots.
According to the text, vessel tracking systems can be used to provide all the benefits below, EXCEPT
a) identifying the location of all pirate vessels in hostile waters.
b) knowing the accurate position of all vessels in a fleet.
c) using location data to manage payroll of personnel.
d) enhancing the efficiency of a single ship or fleet.
e) allowing significant savings in fuel costs.
www.tecconcursos.com.br/questoes/266084
by Holly Johnson
Cheap, easy credit might have been tempting to young people in the past, but not to today’s millennials. According to a recent survey by Bankrate of over 1,161
consumers, 63% of adults ages 18 to 29 live without a credit card of any kind, and another 23% only carry one card.
Research shows that the environment millennials grew up in might have an impact on their finances. Unlike other generations, millennials lived through economic
hardships during a time when their adult lives were beginning. According to the Bureau of Labor Statistics, the Great Recession caused millennials to stray from historic
patterns when it comes to purchasing a home and having children, and a fear of credit cards could be another symptom of the economic environment of the times.
And there’s much data when it comes to proving that millennials grew up on shaky economic ground. The Pew Research Center reports that 36% of millennials lived at
home with their parents in 2012. Meanwhile, the unemployment rate for people ages 16 to 24 was 14.2% (more than twice the national rate) in early 2014, according to
the BLS. With those figures, it’s no wonder that millennials are skittish when it comes to credit cards. It makes sense that young people would be afraid to take on any new
forms of debt.
But the Great Recession isn’t the only reason millennials could be fearful of credit. Many experts believe that the nation’s student loan debt level might be related to it.
According to the Institute for College Access & Success, 71% of millennials (or 1.3 million students) who graduated from college in 2012 left school with at least some
student loan debt, with the average amount owed around $29,400.
With so much debt already under their belts, millennials are worried about adding any credit card debt to the pile. After all, many adults with student loan debt need to
make payments for years, and even decades.
The fact that millennials are smart enough to avoid credit card debt is a good thing, but that doesn’t mean the decision doesn’t have its drawbacks. According to Experian, most
adults need a positive credit history in order to qualify for an auto loan or mortgage. Even
worse, having no credit history is almost as bad as having a negative credit history in some cases.
Still, there are plenty of ways millennials can build a credit history without a credit card. A few tips:
Make payments on installment loans on time. Whether it’s a car loan, student loan or personal loan, make sure to mail in those payments on time and pay at least
the minimum amount required.
Put at least one household or utility bill in your name. Paying your utility or household bills on time can help you build a positive credit history.
Get a secured credit card. Unlike traditional credit cards, the funds secured credit cards offer are backed by money the user deposits. Signing up for a secured card
is one way to build a positive credit history without any risk.
www.tecconcursos.com.br/questoes/266087
by Holly Johnson
Cheap, easy credit might have been tempting to young people in the past, but not to today’s millennials. According to a recent survey by Bankrate of over 1,161
consumers, 63% of adults ages 18 to 29 live without a credit card of any kind, and another 23% only carry one card.
Research shows that the environment millennials grew up in might have an impact on their finances. Unlike other generations, millennials lived through economic
hardships during a time when their adult lives were beginning. According to the Bureau of Labor Statistics, the Great Recession caused millennials to stray from historic
patterns when it comes to purchasing a home and having children, and a fear of credit cards could be another symptom of the economic environment of the times.
And there’s much data when it comes to proving that millennials grew up on shaky economic ground. The Pew Research Center reports that 36% of millennials lived at
home with their parents in 2012. Meanwhile, the unemployment rate for people ages 16 to 24 was 14.2% (more than twice the national rate) in early 2014, according to
the BLS. With those figures, it’s no wonder that millennials are skittish when it comes to credit cards. It makes sense that young people would be afraid to take on any new
forms of debt.
But the Great Recession isn’t the only reason millennials could be fearful of credit. Many experts believe that the nation’s student loan debt level might be related to it.
According to the Institute for College Access & Success, 71% of millennials (or 1.3 million students) who graduated from college in 2012 left school with at least some
student loan debt, with the average amount owed around $29,400.
With so much debt already under their belts, millennials are worried about adding any credit card debt to the pile. After all, many adults with student loan debt need to
make payments for years, and even decades.
The fact that millennials are smart enough to avoid credit card debt is a good thing, but that doesn’t mean the decision doesn’t have its drawbacks. According to Experian, most
adults need a positive credit history in order to qualify for an auto loan or mortgage. Even
worse, having no credit history is almost as bad as having a negative credit history in some cases.
Still, there are plenty of ways millennials can build a credit history without a credit card. A few tips:
Make payments on installment loans on time. Whether it’s a car loan, student loan or personal loan, make sure to mail in those payments on time and pay at least
the minimum amount required.
Put at least one household or utility bill in your name. Paying your utility or household bills on time can help you build a positive credit history.
Get a secured credit card. Unlike traditional credit cards, the funds secured credit cards offer are backed by money the user deposits. Signing up for a secured card
is one way to build a positive credit history without any risk.
The fact that millennials are leery of credit cards is probably a good thing in the long run. After all, not having a credit card is the perfect way to stay out of credit card
debt. Even though it might be harder to build a credit history without credit cards, the vast majority of millennials have decided that the plastic just isn’t worth it.
The sentence of the text “With so much debt already under their belts, millennials are worried about adding any credit card debt to the pile” conveys the idea that
millennials have
a) piles of bills to pay every month, but they can use their credit cards moderately.
b) so many bills to pay that credit card bills wouldn’t make much difference.
c) so many bills to pay that they have to sell their belongings.
d) so much debt to pay that they can’t afford another one.
e) no credit cards simply because they don’t like them.
www.tecconcursos.com.br/questoes/304203
People have virtually unlimited needs, but the economic resources to supply those needs are limited. Therefore, the greatest benefit of an economy is to provide the most
desirable consumer goods and services in the most desirable amounts - what is known as the efficient allocation of economic resources. To produce these consumer goods
and services requires capital in the form of labor, land, capital goods used to produce a desired product or service, and entrepreneurial ability to use these resources.
The financial system of an economy provides the means to collect money from the people who have it and distribute it to those who can use it best. Hence, the efficient
allocation of economic resources is achieved by a financial system that allocates money to those people and for those purposes that will yield the greatest return.
The financial system is composed of the products and services provided by financial institutions, which include banks, insurance companies, pension funds, organized
exchanges, and the many other companies that serve to facilitate economic transactions. Virtually all economic transactions are effected by one or more of these financial
institutions. They create financial instruments, such as stocks and bonds, pay interest on deposits, lend money to creditworthy borrowers, and create and maintain the
payment systems of modern economies.
These financial products and services are based on the following fundamental objectives of any modern financial system:
Available at: <http://thismatter.com/money/banking/ financial-system.htm>. Retrieved on: July 27th, 2015. Adapted.
From the sentence of the text “The financial system of an economy provides the means to collect money from the people who have it and distribute it to those who can use
it best”, it can be inferred that people who
a) can use the money most efficiently are those who have much money.
b) operate the financial system of an economy collect and distribute money the best way.
c) receive the distributed money don’t know how to use it best.
d) have much money and know how to use it best are the same.
e) operate the financial system of an economy collect the money and keep it.
www.tecconcursos.com.br/questoes/601490
Paul Stenquist
Cars and trucks powered by natural gas make up a significant portion of the vehicle fleet in many parts of the world. Iran has more than two million natural gas vehicles on
the road. As of 2009, Argentina had more than 1.8 million in operation and almost 2,000 natural gas filling stations. Brazil was not far behind. Italy and Germany have
substantial natural gas vehicle fleets. Is America next?
With natural gas in plentiful supply at bargain prices in the United States, issues that have limited its use in cars are being rethought, and its market share could increase,
perhaps substantially.
According to Energy Department Price Information from July, natural gas offers economic advantages over gasoline and diesel fuels. If a gasoline-engine vehicle can take
you 40 miles on one gallon, the same vehicle running on compressed natural gas can do it for about $1.50 less at today’s prices. To that savings add lower maintenance
costs. A study of New York City cabs running on natural gas found that oil changes need not be as frequent because of the clean burn of the fuel, and exhaust-system parts
last longer because natural gas is less corrosive than other fuels.
Today, those economic benefits are nullified by the initial cost of a natural gas vehicle — 20 to 30 percent more than a comparable gasoline-engine vehicle. But were
production to increase significantly, economies of scale would bring prices down. In an interview by phone, Jon Coleman, fleet sustainability manager at the Ford Motor
Company, said that given sufficient volume, the selling price of natural gas vehicles could be comparable to that of conventional vehicles.
It may be years before the economic benefits of natural gas vehicles can be realized, but the environmental benefits appear to be immediate. According to the Energy
Department’s website, natural gas vehicles have smaller carbon footprints than gasoline or diesel automobiles, even when taking into account the natural gas production
process, which releases carbon-rich methane into the atmosphere.
The United States government appears to favor natural gas as a motor vehicle fuel. To promote the production of vehicles with fewer carbon emissions, it has allowed
automakers to count certain vehicle types more than once when calculating their Corporate Average Fuel Economy, under regulations mandating a fleet average of 54.5
miles per gallon by 2025. Plug-in hybrids and natural gas vehicles can be counted 1.6 times under the CAFE standards, and electric vehicles can be counted twice.
Adapting natural gas as a vehicle fuel introduces engineering challenges. While the fuel burns clean, it is less energy dense than gasoline, so if it is burned in an engine
designed to run on conventional fuel, performance and efficiency are degraded.
But since natural gas has an octane rating of 130, compared with 93 for the best gasoline, an engine designed for it can run with very high cylinder pressure, which would
cause a regular gasoline engine to knock from premature ignition. More cylinder pressure yields more power, and thus the energy-density advantage of gasoline can be
nullified.[...]
Until the pressurized fuel tanks of natural gas vehicles can be easily and quickly refueled, the fleet cannot grow substantially. The number of commercial refueling stations
for compressed natural gas has been increasing at a rate of 16 percent yearly, the Energy Department says. And, while the total is still small, advances in refueling
equipment should increase the rate of expansion. Much of the infrastructure is already in place: America has millions of miles of natural gas pipeline. Connecting that
network to refueling equipment is not difficult.
Although commercial refueling stations will be necessary to support a substantial fleet of natural gas vehicles, home refueling may be the magic bullet that makes the
vehicles practical. Electric vehicles depend largely on home charging and most have less than half the range of a fully fueled natural gas vehicle. Some compressed natural
gas home refueling products are available, but they can cost as much as $5,000.
Seeking to change that, the Energy Department has awarded grants to a number of companies in an effort to develop affordable home-refueling equipment.
[...]
www.tecconcursos.com.br/questoes/601492
Paul Stenquist
Cars and trucks powered by natural gas make up a significant portion of the vehicle fleet in many parts of the world. Iran has more than two million natural gas vehicles on
the road. As of 2009, Argentina had more than 1.8 million in operation and almost 2,000 natural gas filling stations. Brazil was not far behind. Italy and Germany have
substantial natural gas vehicle fleets. Is America next?
With natural gas in plentiful supply at bargain prices in the United States, issues that have limited its use in cars are being rethought, and its market share could increase,
perhaps substantially.
According to Energy Department Price Information from July, natural gas offers economic advantages over gasoline and diesel fuels. If a gasoline-engine vehicle can take
you 40 miles on one gallon, the same vehicle running on compressed natural gas can do it for about $1.50 less at today’s prices. To that savings add lower maintenance
costs. A study of New York City cabs running on natural gas found that oil changes need not be as frequent because of the clean burn of the fuel, and exhaust-system parts
last longer because natural gas is less corrosive than other fuels.
Today, those economic benefits are nullified by the initial cost of a natural gas vehicle — 20 to 30 percent more than a comparable gasoline-engine vehicle. But were
production to increase significantly, economies of scale would bring prices down. In an interview by phone, Jon Coleman, fleet sustainability manager at the Ford Motor
Company, said that given sufficient volume, the selling price of natural gas vehicles could be comparable to that of conventional vehicles.
It may be years before the economic benefits of natural gas vehicles can be realized, but the environmental benefits appear to be immediate. According to the Energy
Department’s website, natural gas vehicles have smaller carbon footprints than gasoline or diesel automobiles, even when taking into account the natural gas production
process, which releases carbon-rich methane into the atmosphere.
The United States government appears to favor natural gas as a motor vehicle fuel. To promote the production of vehicles with fewer carbon emissions, it has allowed
automakers to count certain vehicle types more than once when calculating their Corporate Average Fuel Economy, under regulations mandating a fleet average of 54.5
miles per gallon by 2025. Plug-in hybrids and natural gas vehicles can be counted 1.6 times under the CAFE standards, and electric vehicles can be counted twice.
Adapting natural gas as a vehicle fuel introduces engineering challenges. While the fuel burns clean, it is less energy dense than gasoline, so if it is burned in an engine
designed to run on conventional fuel, performance and efficiency are degraded.
But since natural gas has an octane rating of 130, compared with 93 for the best gasoline, an engine designed for it can run with very high cylinder pressure, which would
cause a regular gasoline engine to knock from premature ignition. More cylinder pressure yields more power, and thus the energy-density advantage of gasoline can be
nullified.[...]
Until the pressurized fuel tanks of natural gas vehicles can be easily and quickly refueled, the fleet cannot grow substantially. The number of commercial refueling stations
for compressed natural gas has been increasing at a rate of 16 percent yearly, the Energy Department says. And, while the total is still small, advances in refueling
equipment should increase the rate of expansion. Much of the infrastructure is already in place: America has millions of miles of natural gas pipeline. Connecting that
network to refueling equipment is not difficult.
Although commercial refueling stations will be necessary to support a substantial fleet of natural gas vehicles, home refueling may be the magic bullet that makes the
vehicles practical. Electric vehicles depend largely on home charging and most have less than half the range of a fully fueled natural gas vehicle. Some compressed natural
gas home refueling products are available, but they can cost as much as $5,000.
Seeking to change that, the Energy Department has awarded grants to a number of companies in an effort to develop affordable home-refueling equipment.
[...]
According to the paragraph delimited by the highlighted lines in the text, one can infer that
a) gasoline is as expensive as diesel in New York City.
b) a car running on natural gas will pay $1.50 on one gallon of the fuel.
c) every car running on natural gas will afford to save $3.00 on a 60-mile drive.
d) the cost of oil changes can improve savings in natural gas-fueled vehicles.
e) natural gas cannot be associated with corrosion in car’s exhaust-system parts.
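For reference, here is a quick check of the arithmetic behind the dollar figures quoted above, using only the passage’s own number of about $1.50 saved per 40 miles and assuming, for the sake of the check alone, that the saving scales linearly with distance:

\$1.50 \times \frac{60\ \text{miles}}{40\ \text{miles}} = \$2.25

So a 60-mile drive would correspond to roughly $2.25 in savings rather than a flat $3.00, and the passage’s hedges (“about”, “at today’s prices”) mean even that figure is approximate.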
www.tecconcursos.com.br/questoes/601493
Paul Stenquist
Cars and trucks powered by natural gas make up a significant portion of the vehicle fleet in many parts of the world. Iran has more than two million natural gas vehicles on
the road. As of 2009, Argentina had more than 1.8 million in operation and almost 2,000 natural gas filling stations. Brazil was not far behind. Italy and Germany have
substantial natural gas vehicle fleets. Is America next?
With natural gas in plentiful supply at bargain prices in the United States, issues that have limited its use in cars are being rethought, and its market share could increase,
perhaps substantially.
According to Energy Department Price Information from July, natural gas offers economic advantages over gasoline and diesel fuels. If a gasoline-engine vehicle can take
you 40 miles on one gallon, the same vehicle running on compressed natural gas can do it for about $1.50 less at today’s prices. To that savings add lower maintenance
costs. A study of New York City cabs running on natural gas found that oil changes need not be as frequent because of the clean burn of the fuel, and exhaust-system parts
last longer because natural gas is less corrosive than other fuels.
Today, those economic benefits are nullified by the initial cost of a natural gas vehicle — 20 to 30 percent more than a comparable gasoline-engine vehicle. But were
production to increase significantly, economies of scale would bring prices down. In an interview by phone, Jon Coleman, fleet sustainability manager at the Ford Motor
Company, said that given sufficient volume, the selling price of natural gas vehicles could be comparable to that of conventional vehicles.
It may be years before the economic benefits of natural gas vehicles can be realized, but the environmental benefits appear to be immediate. According to the Energy
Department’s website, natural gas vehicles have smaller carbon footprints than gasoline or diesel automobiles, even when taking into account the natural gas production
process, which releases carbon-rich methane into the atmosphere.
The United States government appears to favor natural gas as a motor vehicle fuel. To promote the production of vehicles with fewer carbon emissions, it has allowed
automakers to count certain vehicle types more than once when calculating their Corporate Average Fuel Economy, under regulations mandating a fleet average of 54.5
miles per gallon by 2025. Plug-in hybrids and natural gas vehicles can be counted 1.6 times under the CAFE standards, and electric vehicles can be counted twice.
Adapting natural gas as a vehicle fuel introduces engineering challenges. While the fuel burns clean, it is less energy dense than gasoline, so if it is burned in an engine
designed to run on conventional fuel, performance and efficiency are degraded.
But since natural gas has an octane rating of 130, compared with 93 for the best gasoline, an engine designed for it can run with very high cylinder pressure, which would
cause a regular gasoline engine to knock from premature ignition. More cylinder pressure yields more power, and thus the energy-density advantage of gasoline can be
nullified.[...]
Until the pressurized fuel tanks of natural gas vehicles can be easily and quickly refueled, the fleet cannot grow substantially. The number of commercial refueling stations
for compressed natural gas has been increasing at a rate of 16 percent yearly, the Energy Department says. And, while the total is still small, advances in refueling
equipment should increase the rate of expansion. Much of the infrastructure is already in place: America has millions of miles of natural gas pipeline. Connecting that
network to refueling equipment is not difficult.
Although commercial refueling stations will be necessary to support a substantial fleet of natural gas vehicles, home refueling may be the magic bullet that makes the
vehicles practical. Electric vehicles depend largely on home charging and most have less than half the range of a fully fueled natural gas vehicle. Some compressed natural
gas home refueling products are available, but they can cost as much as $5,000.
Seeking to change that, the Energy Department has awarded grants to a number of companies in an effort to develop affordable home-refueling equipment.
[...]
The sentence of the text “But were production to increase significantly, economies of scale would bring prices down” has the same meaning as:
a) Economies of scale would reduce production and prices significantly.
b) Economies of scale would be one of the conditions for the decrease of prices.
c) Production would increase unless economies of scale brought prices down.
d) Production would increase significantly if economies of scale didn’t bring the prices down.
e) Prices would not go down although the production increased.
www.tecconcursos.com.br/questoes/601494
Paul Stenquist
Cars and trucks powered by natural gas make up a significant portion of the vehicle fleet in many parts of the world. Iran has more than two million natural gas vehicles on
the road. As of 2009, Argentina had more than 1.8 million in operation and almost 2,000 natural gas filling stations. Brazil was not far behind. Italy and Germany have
substantial natural gas vehicle fleets. Is America next?
With natural gas in plentiful supply at bargain prices in the United States, issues that have limited its use in cars are being rethought, and its market share could increase,
perhaps substantially.
According to Energy Department Price Information from July, natural gas offers economic advantages over gasoline and diesel fuels. If a gasoline-engine vehicle can take
you 40 miles on one gallon, the same vehicle running on compressed natural gas can do it for about $1.50 less at today’s prices. To that savings add lower maintenance
costs. A study of New York City cabs running on natural gas found that oil changes need not be as frequent because of the clean burn of the fuel, and exhaust-system parts
last longer because natural gas is less corrosive than other fuels.
Today, those economic benefits are nullified by the initial cost of a natural gas vehicle — 20 to 30 percent more than a comparable gasoline-engine vehicle. But were
production to increase significantly, economies of scale would bring prices down. In an interview by phone, Jon Coleman, fleet sustainability manager at the Ford Motor
Company, said that given sufficient volume, the selling price of natural gas vehicles could be comparable to that of conventional vehicles.
It may be years before the economic benefits of natural gas vehicles can be realized, but the environmental benefits appear to be immediate. According to the Energy
Department’s website, natural gas vehicles have smaller carbon footprints than gasoline or diesel automobiles, even when taking into account the natural gas production
process, which releases carbon-rich methane into the atmosphere.
The United States government appears to favor natural gas as a motor vehicle fuel. To promote the production of vehicles with fewer carbon emissions, it has allowed
automakers to count certain vehicle types more than once when calculating their Corporate Average Fuel Economy, under regulations mandating a fleet average of 54.5
miles per gallon by 2025. Plug-in hybrids and natural gas vehicles can be counted 1.6 times under the CAFE standards, and electric vehicles can be counted twice.
Adapting natural gas as a vehicle fuel introduces engineering challenges. While the fuel burns clean, it is less energy dense than gasoline, so if it is burned in an engine designed to run on conventional fuel, performance and efficiency are degraded.
But since natural gas has an octane rating of 130, compared with 93 for the best gasoline, an engine designed for it can run with very high cylinder pressure, which would
cause a regular gasoline engine to knock from premature ignition. More cylinder pressure yields more power, and thus the energy-density advantage of gasoline can be
nullified.[...]
Until the pressurized fuel tanks of natural gas vehicles can be easily and quickly refueled, the fleet cannot grow substantially. The number of commercial refueling stations
for compressed natural gas has been increasing at a rate of 16 percent yearly, the Energy Department says. And, while the total is still small, advances in refueling
equipment should increase the rate of expansion. Much of the infrastructure is already in place: America has millions of miles of natural gas pipeline. Connecting that
network to refueling equipment is not difficult.
Although commercial refueling stations will be necessary to support a substantial fleet of natural gas vehicles, home refueling may be the magic bullet that makes the
vehicles practical. Electric vehicles depend largely on home charging and most have less than half the range of a fully fueled natural gas vehicle. Some compressed natural
gas home refueling products are available, but they can cost as much as $5,000.
Seeking to change that, the Energy Department has awarded grants to a number of companies in an effort to develop affordable home-refueling equipment.
[...]
In the 5th paragraph, limited by lines 35-42 in the text, the author defends the idea that
a) economic and environmental benefits of natural gas vehicles are both immediate results of smaller footprints than those of gasoline or diesel automobiles.
b) economic benefits of natural gas vehicles are not as considerable as the environmental benefits because of the cost of the natural gas production process.
c) natural gas vehicles produce smaller footprints than those of gasoline or diesel automobiles because they bring more environmental benefits.
d) environmental benefits of natural gas vehicles are remarkable despite the carbon-rich methane released into the atmosphere in the production process.
e) environmental benefits of natural gas vehicles are not as considerable as the economic benefits because of the cost of the carbon-rich methane released into the
atmosphere in the production process.
www.tecconcursos.com.br/questoes/601496
Paul Stenquist
Cars and trucks powered by natural gas make up a significant portion of the vehicle fleet in many parts of the world. Iran has more than two million natural gas vehicles on
the road. As of 2009, Argentina had more than 1.8 million in operation and almost 2,000 natural gas filling stations. Brazil was not far behind. Italy and Germany have
substantial natural gas vehicle fleets. Is America next?
With natural gas in plentiful supply at bargain prices in the United States, issues that have limited its use in cars are being rethought, and its market share could increase,
perhaps substantially.
According to Energy Department Price Information from July, natural gas offers economic advantages over gasoline and diesel fuels. If a gasoline-engine vehicle can take
you 40 miles on one gallon, the same vehicle running on compressed natural gas can do it for about $1.50 less at today’s prices. To that savings add lower maintenance
costs. A study of New York City cabs running on natural gas found that oil changes need not be as frequent because of the clean burn of the fuel, and exhaust-system parts
last longer because natural gas is less corrosive than other fuels.
Today, those economic benefits are nullified by the initial cost of a natural gas vehicle — 20 to 30 percent more than a comparable gasoline-engine vehicle. But were
production to increase significantly, economies of scale would bring prices down. In an interview by phone, Jon Coleman, fleet sustainability manager at the Ford Motor
Company, said that given sufficient volume, the selling price of natural gas vehicles could be comparable to that of conventional vehicles.
It may be years before the economic benefits of natural gas vehicles can be realized, but the environmental benefits appear to be immediate. According to the Energy
Department’s website, natural gas vehicles have smaller carbon footprints than gasoline or diesel automobiles, even when taking into account the natural gas production
process, which releases carbon-rich methane into the atmosphere.
The United States government appears to favor natural gas as a motor vehicle fuel. To promote the production of vehicles with fewer carbon emissions, it has allowed
automakers to count certain vehicle types more than once when calculating their Corporate Average Fuel Economy, under regulations mandating a fleet average of 54.5
miles per gallon by 2025. Plug-in hybrids and natural gas vehicles can be counted 1.6 times under the CAFE standards, and electric vehicles can be counted twice.
Adapting natural gas as a vehicle fuel introduces engineering challenges. While the fuel burns clean, it is less energy dense than gasoline, so if it is burned in an engine
designed to run on conventional fuel, performance and efficiency are degraded.
But since natural gas has an octane rating of 130, compared with 93 for the best gasoline, an engine designed for it can run with very high cylinder pressure, which would
cause a regular gasoline engine to knock from premature ignition. More cylinder pressure yields more power, and thus the energy-density advantage of gasoline can be
nullified.[...]
Until the pressurized fuel tanks of natural gas vehicles can be easily and quickly refueled, the fleet cannot grow substantially. The number of commercial refueling stations
for compressed natural gas has been increasing at a rate of 16 percent yearly, the Energy Department says. And, while the total is still small, advances in refueling
equipment should increase the rate of expansion. Much of the infrastructure is already in place: America has millions of miles of natural gas pipeline. Connecting that
network to refueling equipment is not difficult.
Although commercial refueling stations will be necessary to support a substantial fleet of natural gas vehicles, home refueling may be the magic bullet that makes the
vehicles practical. Electric vehicles depend largely on home charging and most have less than half the range of a fully fueled natural gas vehicle. Some compressed natural
gas home refueling products are available, but they can cost as much as $5,000.
Seeking to change that, the Energy Department has awarded grants to a number of companies in an effort to develop affordable home-refueling equipment.
[...]
According to the 6th paragraph in the text (lines 43-52), one of the Corporate Average Fuel Economy goals for the fleet in the United States is an average of 54.5 miles per
gallon
a) in 2025
b) prior to 2025
c) around 2025
d) sometime before 2025
e) not later than 2025
www.tecconcursos.com.br/questoes/601498
Paul Stenquist
Cars and trucks powered by natural gas make up a significant portion of the vehicle fleet in many parts of the world. Iran has more than two million natural gas vehicles on
the road. As of 2009, Argentina had more than 1.8 million in operation and almost 2,000 natural gas filling stations. Brazil was not far behind. Italy and Germany have
substantial natural gas vehicle fleets. Is America next?
With natural gas in plentiful supply at bargain prices in the United States, issues that have limited its use in cars are being rethought, and its market share could increase,
perhaps substantially.
According to Energy Department Price Information from July, natural gas offers economic advantages over gasoline and diesel fuels. If a gasoline-engine vehicle can take
you 40 miles on one gallon, the same vehicle running on compressed natural gas can do it for about $1.50 less at today’s prices. To that savings add lower maintenance
costs. A study of New York City cabs running on natural gas found that oil changes need not be as frequent because of the clean burn of the fuel, and exhaust-system parts
last longer because natural gas is less corrosive than other fuels.
Today, those economic benefits are nullified by the initial cost of a natural gas vehicle — 20 to 30 percent more than a comparable gasoline-engine vehicle. But were
production to increase significantly, economies of scale would bring prices down. In an interview by phone, Jon Coleman, fleet sustainability manager at the Ford Motor
Company, said that given sufficient volume, the selling price of natural gas vehicles could be comparable to that of conventional vehicles.
It may be years before the economic benefits of natural gas vehicles can be realized, but the environmental benefits appear to be immediate. According to the Energy
Department’s website, natural gas vehicles have smaller carbon footprints than gasoline or diesel automobiles, even when taking into account the natural gas production
process, which releases carbon-rich methane into the atmosphere.
The United States government appears to favor natural gas as a motor vehicle fuel. To promote the production of vehicles with fewer carbon emissions, it has allowed
automakers to count certain vehicle types more than once when calculating their Corporate Average Fuel Economy, under regulations mandating a fleet average of 54.5
miles per gallon by 2025. Plug-in hybrids and natural gas vehicles can be counted 1.6 times under the CAFE standards, and electric vehicles can be counted twice.
Adapting natural gas as a vehicle fuel introduces engineering challenges. While the fuel burns clean, it is less energy dense than gasoline, so if it is burned in an engine
designed to run on conventional fuel, performance and efficiency are degraded.
But since natural gas has an octane rating of 130, compared with 93 for the best gasoline, an engine designed for it can run with very high cylinder pressure, which would
cause a regular gasoline engine to knock from premature ignition. More cylinder pressure yields more power, and thus the energy-density advantage of gasoline can be
nullified.[...]
Until the pressurized fuel tanks of natural gas vehicles can be easily and quickly refueled, the fleet cannot grow substantially. The number of commercial refueling stations
for compressed natural gas has been increasing at a rate of 16 percent yearly, the Energy Department says. And, while the total is still small, advances in refueling
equipment should increase the rate of expansion. Much of the infrastructure is already in place: America has millions of miles of natural gas pipeline. Connecting that
network to refueling equipment is not difficult.
Although commercial refueling stations will be necessary to support a substantial fleet of natural gas vehicles, home refueling may be the magic bullet that makes the
vehicles practical. Electric vehicles depend largely on home charging and most have less than half the range of a fully fueled natural gas vehicle. Some compressed natural
gas home refueling products are available, but they can cost as much as $5,000.
Seeking to change that, the Energy Department has awarded grants to a number of companies in an effort to develop affordable home-refueling equipment.
[...]
According to the 9th paragraph in the text (lines 65-75), refueling stations in the United States
www.tecconcursos.com.br/questoes/199841
In a previous blog post I wrote that one of the best ways to motivate people is to stimulate a desire for mastery – and that breaking things into small pieces and showing
progress through the pieces encourages the desire for mastery. Another tip for stimulating the desire for mastery is to give people autonomy. When people feel that they
have some control over what they are doing and how they do it, then their desire for mastery increases. They will then be motivated to continue and keep learning. If
people feel that they don’t have any control or autonomy, then they lose the desire to learn and do more – they may lose the desire to master whatever task you are asking
them to do. Here’s an example: Let’s say that you have created a language learning app. The desire for mastery will be automatically in play if the person wants to learn a
language. However, if you want people to continue using the app, and use it frequently and often, then you have to do more than just present lessons in the app. One way
to further stimulate the desire for mastery, is to give them some control over how they use the app. You can provide different types of exercises and interactions, such as
listening, writing, or speaking the language, and let them choose which exercises and activities they need or want, and in what order to do them. If they feel they have
control over how quickly they go through the lessons, which ones they repeat, which activities to engage in, and in what order, then they will be more motivated to keep
learning. What do you think? Have you tried giving autonomy to keep people motivated?
Available at: <http://www.psychologytoday.com/blog/brain-wise/201310/give-people-autonomy>. Retrieved on: Oct. 15th 2013. Adapted.
www.tecconcursos.com.br/questoes/199843
In a previous blog post I wrote that one of the best ways to motivate people is to stimulate a desire for mastery – and that breaking things into small pieces and showing
progress through the pieces encourages the desire for mastery. Another tip for stimulating the desire for mastery is to give people autonomy. When people feel that they
have some control over what they are doing and how they do it, then their desire for mastery increases. They will then be motivated to continue and keep learning. If
people feel that they don’t have any control or autonomy, then they lose the desire to learn and do more – they may lose the desire to master whatever task you are asking
them to do. Here’s an example: Let’s say that you have created a language learning app. The desire for mastery will be automatically in play if the person wants to learn a
language. However, if you want people to continue using the app, and use it frequently and often, then you have to do more than just present lessons in the app. One way
to further stimulate the desire for mastery, is to give them some control over how they use the app. You can provide different types of exercises and interactions, such as
listening, writing, or speaking the language, and let them choose which exercises and activities they need or want, and in what order to do them. If they feel they have
control over how quickly they go through the lessons, which ones they repeat, which activities to engage in, and in what order, then they will be more motivated to keep
learning. What do you think? Have you tried giving autonomy to keep people motivated?
Available at: <http://www.psychologytoday.com/blog/brain-wise/201310/give-people-autonomy>. Retrieved on: Oct. 15th 2013. Adapted.
The expression of the text “another tip” suggests that the author
a) presented a tip before.
b) presented two tips before.
c) has never presented any tip.
d) presents a tip for the first time.
e) wrote about suspending people’s autonomy.
www.tecconcursos.com.br/questoes/199846
In a previous blog post I wrote that one of the best ways to motivate people is to stimulate a desire for mastery – and that breaking things into small pieces and showing
progress through the pieces encourages the desire for mastery. Another tip for stimulating the desire for mastery is to give people autonomy. When people feel that they
have some control over what they are doing and how they do it, then their desire for mastery increases. They will then be motivated to continue and keep learning. If
people feel that they don’t have any control or autonomy, then they lose the desire to learn and do more – they may lose the desire to master whatever task you are asking
them to do. Here’s an example: Let’s say that you have created a language learning app. The desire for mastery will be automatically in play if the person wants to learn a
language. However, if you want people to continue using the app, and use it frequently and often, then you have to do more than just present lessons in the app. One way
to further stimulate the desire for mastery, is to give them some control over how they use the app. You can provide different types of exercises and interactions, such as
listening, writing, or speaking the language, and let them choose which exercises and activities they need or want, and in what order to do them. If they feel they have
control over how quickly they go through the lessons, which ones they repeat, which activities to engage in, and in what order, then they will be more motivated to keep
learning. What do you think? Have you tried giving autonomy to keep people motivated?
Available at: <http://www.psychologytoday.com/blog/brain-wise/201310/give-people-autonomy>. Retrieved on: Oct. 15th 2013. Adapted.
In the fragments of the text: “they may lose the desire to master whatever task you are asking them to do” and “then you have to do more than just present lessons in
the app”, the verb forms in bold express the ideas, respectively, of
www.tecconcursos.com.br/questoes/213697
So you’re thinking about a field job in the oil industry. If you haven’t been involved in the oil patch before, you probably have no idea how vast it is, or where to start your
job search. Many sites will try to convince you that you can get a job on an offshore rig making $10,000 a month without any experience or training at all, and while this is
possible, it’s not at all likely. Actually, it can be tough to find a job in any field of the oil industry without some experience or training.
First, you should realize that the oil industry isn’t just drilling rigs, pumpjacks, and gas stations. The oil industry is a lot like the military in that it employs people in nearly
every profession. There are positions such as roughneck or airgun operator, that are very specific to the oil industry; but there are also welders, medics, chemists,
biologists, environmentalists, cooks, computer programmers, engineers, and a thousand more positions that are absolutely essential to the industry. You don’t have to have
experience specifically in the oil industry in order to have relevant experience.
The oil patch is a little bit different from most other industries. You’ll soon lose the idea of a weekend as you now know it... The patch runs seven days a week, and in
many cases, 24 hours a day. You’ll be expected to work every day in all weather conditions, for weeks or even months at a time. The oil industry is also very production
oriented; you’ll make more money welding in the oil patch than in another industry, but you’ll work longer and harder for that bigger paycheck.
There are a few prerequisites if you want a field job in the oil patch:
You must be in reasonably good physical condition, and be able to lift at least 50 lbs. regularly.
For most positions, you must have a valid driver’s license.
You must have suitable clothing for extended outdoor work and in most cases, hard toed safety boots.
You should not have any medical condition which would make it unsafe for you to operate machinery.
You don’t need to live in the city where your employer is located, but in most cases you will have to provide your own transportation to and from your home from
the employer’s location (point-of-hire). If you live a long way from any area with oil and gas activity, you will have a very difficult time finding an entry level job in
this industry.
You must be willing and able to work hard for long hours. This industry is all about production, and if you don’t produce, you’re not an asset to the company.
You must be drug-free. Most companies conduct pre-employment drug screenings and random testing of employees. If your test shows signs of illegal drugs in your
system, you will not be hired. Most oil work requires you to live away from home, in motels or camps near the jobs. Your travel, accommodations, and meals will
usually be paid by your employer while you’re working. Most companies also provide all required safety supplies, such as hard hats and reflective safety vests. You
are required to supply your own work clothes, boots, gloves, etc.
Before you leave for your first job, be sure you have appropriate clothing to spend 14 hours outside... frostbite isn’t fun, neither is heat stroke.
Much of the work in the oil industry is very physically demanding, especially in the entry level positions. There is no upper age limit, but you should be willing and able to
work hard for long hours, lift 50 lbs regularly, and be in relatively good physical condition. If you have back or other health problems that prevent strenuous activity, you
may want to reconsider this line of work. Most companies require employees to
be at least 18 years old. A recent hearing test and/or medical evaluation may be required.
Many oilfield companies also require a preemployment drug and alcohol screening. You should know that though you can make a lot of money in a month in the oil patch,
you can also make no money in a month. Most oilfield work isn’t very stable, and you’ll occasionally find yourself laid-off on short notice due to a shortage of work... and
called back on even shorter notice. Many people in Canada work in the oil
industry during the winter while it’s busy, then take the spring and summer off, or work non-oilfield summer jobs.
Offshore and overseas rigs usually operate year-round, offering a much more stable work environment; but there are very few positions on these rigs that are available
without any experience. If you’re interested in working on one of these rigs, you may want to start with a catering job. All major offshore and overseas projects employ
catering staff to provide meals for the rig crew. These positions are often available without experience, and rig managers will often hire catering staff onto the rig crew if
they need an extra hand, or if a member of the rig crew gets injured or leaves. It’s a matter of being in the right place at the right time, and showing interest in working
on the rig.
Available at: <http://www.oilfieldworkers.com/oilfieldintro.php> Retrieved on: Aug. 29, 2012.
www.tecconcursos.com.br/questoes/213702
So you’re thinking about a field job in the oil industry. If you haven’t been involved in the oil patch before, you probably have no idea how vast it is, or where to start your
job search. Many sites will try to convince you that you can get a job on an offshore rig making $10,000 a month without any experience or training at all, and while this is
possible, it’s not at all likely. Actually, it can be tough to find a job in any field of the oil industry without some experience or training.
First, you should realize that the oil industry isn’t just drilling rigs, pumpjacks, and gas stations. The oil industry is a lot like the military in that it employs people in nearly
every profession. There are positions such as roughneck or airgun operator, that are very specific to the oil industry; but there are also welders, medics, chemists,
biologists, environmentalists, cooks, computer programmers, engineers, and a thousand more positions that are absolutely essential to the industry. You don’t have to have
experience specifically in the oil industry in order to have relevant experience.
The oil patch is a little bit different from most other industries. You’ll soon lose the idea of a weekend as you now know it... The patch runs seven days a week, and in
many cases, 24 hours a day. You’ll be expected to work every day in all weather conditions, for weeks or even months at a time. The oil industry is also very production
oriented; you’ll make more money welding in the oil patch than in another industry, but you’ll work longer and harder for that bigger paycheck.
There are a few prerequisites if you want a field job in the oil patch:
You must be in reasonably good physical condition, and be able to lift at least 50 lbs. regularly.
For most positions, you must have a valid driver’s license.
You must have suitable clothing for extended outdoor work and in most cases, hard toed safety boots.
You should not have any medical condition which would make it unsafe for you to operate machinery.
You don’t need to live in the city where your employer is located, but in most cases you will have to provide your own transportation to and from your home from
the employer’s location (point-of-hire). If you live a long way from any area with oil and gas activity, you will have a very difficult time finding an entry level job in
this industry.
You must be willing and able to work hard for long hours. This industry is all about production, and if you don’t produce, you’re not an asset to the company.
You must be drug-free. Most companies conduct pre-employment drug screenings and random testing of employees. If your test shows signs of illegal drugs in your
system, you will not be hired. Most oil work requires you to live away from home, in motels or camps near the jobs. Your travel, accommodations, and meals will
usually be paid by your employer while you’re working. Most companies also provide all required safety supplies, such as hard hats and reflective safety vests. You
are required to supply your own work clothes, boots, gloves, etc.
Before you leave for your first job, be sure you have appropriate clothing to spend 14 hours outside... frostbite isn’t fun, neither is heat stroke.
Much of the work in the oil industry is very physically demanding, especially in the entry level positions. There is no upper age limit, but you should be willing and able to
work hard for long hours, lift 50 lbs regularly, and be in relatively good physical condition. If you have back or other health problems that prevent strenuous activity, you
may want to reconsider this line of work. Most companies require employees to
be at least 18 years old. A recent hearing test and/or medical evaluation may be required.
Many oilfield companies also require a preemployment drug and alcohol screening. You should know that though you can make a lot of money in a month in the oil patch,
you can also make no money in a month. Most oilfield work isn’t very stable, and you’ll occasionally find yourself laid-off on short notice due to a shortage of work... and
called back on even shorter notice. Many people in Canada work in the oil
industry during the winter while it’s busy, then take the spring and summer off, or work non-oilfield summer jobs.
Offshore and overseas rigs usually operate year-round, offering a much more stable work environment; but there are very few positions on these rigs that are available
without any experience. If you’re interested in working on one of these rigs, you may want to start with a catering job. All major offshore and overseas projects employ
catering staff to provide meals for the rig crew. These positions are often available without experience, and rig managers will often hire catering staff onto the rig crew if
they need an extra hand, or if a member of the rig crew gets injured or leaves. It’s a matter of being in the right place at the right time, and showing interest in working
on the rig.
Available at: <http://www.oilfieldworkers.com/oilfieldintro.php> Retrieved on: Aug. 29, 2012.
According to Text I, workers in the oil industry can be expected to bear all of the following working conditions, EXCEPT
a) working hard for long hours in order to keep up oil production.
b) having to perform risky jobs in exchange for guaranteed promotions.
c) spending weekends and holidays on the job, sometimes for long periods.
d) facing adverse weather conditions for long stretches of time to ensure productivity.
e) being on duty away from home and resorting to individual transportation to the job post.
www.tecconcursos.com.br/questoes/213710
So you’re thinking about a field job in the oil industry. If you haven’t been involved in the oil patch before, you probably have no idea how vast it is, or where to start your
job search. Many sites will try to convince you that you can get a job on an offshore rig making $10,000 a month without any experience or training at all, and while this is
possible, it’s not at all likely. Actually, it can be tough to find a job in any field of the oil industry without some experience or training.
First, you should realize that the oil industry isn’t just drilling rigs, pumpjacks, and gas stations. The oil industry is a lot like the military in that it employs people in nearly
every profession. There are positions such as roughneck or airgun operator, that are very specific to the oil industry; but there are also welders, medics, chemists,
biologists, environmentalists, cooks, computer programmers, engineers, and a thousand more positions that are absolutely essential to the industry. You don’t have to have
experience specifically in the oil industry in order to have relevant experience.
The oil patch is a little bit different from most other industries. You’ll soon lose the idea of a weekend as you now know it... The patch runs seven days a week, and in
many cases, 24 hours a day. You’ll be expected to work every day in all weather conditions, for weeks or even months at a time. The oil industry is also very production
oriented; you’ll make more money welding in the oil patch than in another industry, but you’ll work longer and harder for that bigger paycheck.
There are a few prerequisites if you want a field job in the oil patch:
You must be in reasonably good physical condition, and be able to lift at least 50 lbs. regularly.
For most positions, you must have a valid driver’s license.
You must have suitable clothing for extended outdoor work and in most cases, hard toed safety boots.
You should not have any medical condition which would make it unsafe for you to operate machinery.
You don’t need to live in the city where your employer is located, but in most cases you will have to provide your own transportation to and from your home from
the employer’s location (point-of-hire). If you live a long way from any area with oil and gas activity, you will have a very difficult time finding an entry level job in
this industry.
You must be willing and able to work hard for long hours. This industry is all about production, and if you don’t produce, you’re not an asset to the company.
You must be drug-free. Most companies conduct pre-employment drug screenings and random testing of employees. If your test shows signs of illegal drugs in your system, you will not be hired. Most oil work requires you to live away from home, in motels or camps near the jobs. Your travel, accommodations, and meals will usually be paid by your employer while you’re working. Most companies also provide all required safety supplies, such as hard hats and reflective safety vests. You are required to supply your own work clothes, boots, gloves, etc.
Before you leave for your first job, be sure you have appropriate clothing to spend 14 hours outside... frostbite isn’t fun, neither is heat stroke.
Much of the work in the oil industry is very physically demanding, especially in the entry level positions. There is no upper age limit, but you should be willing and able to
work hard for long hours, lift 50 lbs regularly, and be in relatively good physical condition. If you have back or other health problems that prevent strenuous activity, you
may want to reconsider this line of work. Most companies require employees to
be at least 18 years old. A recent hearing test and/or medical evaluation may be required.
Many oilfield companies also require a preemployment drug and alcohol screening. You should know that though you can make a lot of money in a month in the oil patch,
you can also make no money in a month. Most oilfield work isn’t very stable, and you’ll occasionally find yourself laid-off on short notice due to a shortage of work... and
called back on even shorter notice. Many people in Canada work in the oil
industry during the winter while it’s busy, then take the spring and summer off, or work non-oilfield summer jobs.
Offshore and overseas rigs usually operate year-round, offering a much more stable work environment; but there are very few positions on these rigs that are available
without any experience. If you’re interested in working on one of these rigs, you may want to start with a catering job. All major offshore and overseas projects employ
catering staff to provide meals for the rig crew. These positions are often available without experience, and rig managers will often hire catering staff onto the rig crew if
they need an extra hand, or if a member of the rig crew gets injured or leaves. It’s a matter of being in the right place at the right time, and showing interest in working
on the rig.
Available at: <http://www.oilfieldworkers.com/oilfieldintro.php> Retrieved on: Aug. 29, 2012.
www.tecconcursos.com.br/questoes/213719
So you’re thinking about a field job in the oil industry. If you haven’t been involved in the oil patch before, you probably have no idea how vast it is, or where to start your
job search. Many sites will try to convince you that you can get a job on an offshore rig making $10,000 a month without any experience or training at all, and while this is
possible, it’s not at all likely. Actually, it can be tough to find a job in any field of the oil industry without some experience or training.
First, you should realize that the oil industry isn’t just drilling rigs, pumpjacks, and gas stations. The oil industry is a lot like the military in that it employs people in nearly
every profession. There are positions such as roughneck or airgun operator, that are very specific to the oil industry; but there are also welders, medics, chemists,
biologists, environmentalists, cooks, computer programmers, engineers, and a thousand more positions that are absolutely essential to the industry. You don’t have to have
experience specifically in the oil industry in order to have relevant experience.
The oil patch is a little bit different from most other industries. You’ll soon lose the idea of a weekend as you now know it... The patch runs seven days a week, and in
many cases, 24 hours a day. You’ll be expected to work every day in all weather conditions, for weeks or even months at a time. The oil industry is also very production
oriented; you’ll make more money welding in the oil patch than in another industry, but you’ll work longer and harder for that bigger paycheck.
There are a few prerequisites if you want a field job in the oil patch:
You must be in reasonably good physical condition, and be able to lift at least 50 lbs. regularly.
For most positions, you must have a valid driver’s license.
You must have suitable clothing for extended outdoor work and in most cases, hard toed safety boots.
You should not have any medical condition which would make it unsafe for you to operate machinery.
You don’t need to live in the city where your employer is located, but in most cases you will have to provide your own transportation to and from your home from
the employer’s location (point-of-hire). If you live a long way from any area with oil and gas activity, you will have a very difficult time finding an entry level job in
this industry.
You must be willing and able to work hard for long hours. This industry is all about production, and if you don’t produce, you’re not an asset to the company.
You must be drug-free. Most companies conduct pre-employment drug screenings and random testing of employees. If your test shows signs of illegal drugs in your
system, you will not be hired. Most oil work requires you to live away from home, in motels or camps near the jobs. Your travel, accommodations, and meals will
usually be paid by your employer while you’re working. Most companies also provide all required safety supplies, such as hard hats and reflective safety vests. You
are required to supply your own work clothes, boots, gloves, etc.
Before you leave for your first job, be sure you have appropriate clothing to spend 14 hours outside... frostbite isn’t fun, neither is heat stroke.
Much of the work in the oil industry is very physically demanding, especially in the entry level positions. There is no upper age limit, but you should be willing and able to
work hard for long hours, lift 50 lbs regularly, and be in relatively good physical condition. If you have back or other health problems that prevent strenuous activity, you
may want to reconsider this line of work. Most companies require employees to be at least 18 years old. A recent hearing test and/or medical evaluation may be required.
Many oilfield companies also require a preemployment drug and alcohol screening. You should know that though you can make a lot of money in a month in the oil patch,
you can also make no money in a month. Most oilfield work isn’t very stable, and you’ll occasionally find yourself laid-off on short notice due to a shortage of work... and
called back on even shorter notice. Many people in Canada work in the oil industry during the winter while it’s busy, then take the spring and summer off, or work non-
oilfield summer jobs.
Offshore and overseas rigs usually operate year-round, offering a much more stable work environment; but there are very few positions on these rigs that are available
without any experience. If you’re interested in working on one of these rigs, you may want to start with a catering job. All major offshore and overseas projects employ
catering staff to provide meals for the rig crew. These positions are often available without experience, and rig managers will often hire catering staff onto the rig crew if
they need an extra hand, or if a member of the rig crew gets injured or leaves. It’s a matter of being in the right place at the right time, and showing interest in working
on the rig.
Available at: <http://www.oilfieldworkers.com/oilfieldintro.php> Retrieved on: Aug. 29, 2012.
The fragment “frostbite isn’t fun, neither is heat stroke” (line 29) refers to the fact that the
a) oil industry offers many stressful challenges but also several moments of leisure.
b) different outside temperatures force professionals in the oil industry to work long hours.
c) different seasons during the year affect the free hours of workers in the oil industry.
d) workers in the oil industry need to be prepared to survive all kinds of weather conditions.
e) appropriate clothing for severe working conditions must also be comfortable for the warm climate.
www.tecconcursos.com.br/questoes/213728
RIO DE JANEIRO, BRAZIL – Spearheaded by record investment in the petroleum and natural gas industry, Brazil’s job market continues to grow at a breakneck pace. Billion
dollar investments by the government and private companies have created a positive landscape for job seekers, with no sign of abating.
“The demand for professionals will continue to increase. I believe we will see an even larger demand in two to three years due to project maintenance and expansion,”
said Rafael Faria, Head of Business Recruiting in Oil & Gas for a global recruiting corporation.
With investments of US$224 billion over the next four years by the major Brazilian oil and gas company, as well as investments by almost all major multinational oil
companies in the exploration of new oil and gas fields, qualified workers are a hot commodity. A federal government estimate indicates that the new Brazilian
oil fields will require 250,000 new professionals through 2016.
Among the professionals most in demand are operations managers, logistics managers, project managers, contract managers and engineers. According to Faria, one of
the most challenging positions to fill is the Contract Manager, which requires a good amount of experience in dealing with the large oil companies and their complex rules
and regulations.
“Human Resource managers are at wits end,” said Rose Santos, Human Resource Manager at an international organization specialized in deepwater engineering services
for the oil industry. “Everyone is fighting for the best professionals. Engineers are getting hired right out of college.”
Most universities offer an undergraduate degree in Petroleum Engineering, and it has become the most sought-after course, passing medicine.
But not only managers are in high demand, skilled workers to build, maintain, repair and perform technical installations on the drill rigs, platforms, ships and other offshore
and onshore structures are essential.
Training courses and programs are trying to keep up with the demand. SENAI (Professional training school) has doubled the number of professional training courses in the
last four years. PROMINP, Programa de Mobilização da Indústria de Petróleo e Gás Natural, a training program developed in 2003 in conjunction with a major oil company
to train ‘blue collar’ workers, plans to turn out 212,000 professionals
by 2014.
Some companies opt to search beyond Brazil’s borders to find professionals. Many of the multinational companies that previously had only a single representative in Brazil,
are looking to extend their presence and have to import talent. Work visas can be a challenge to obtain though, and permanent visas also involve significant immigration
procedures.
While many companies tend to import professionals from their home base, according to Santos, it is common practice to try to replace them
with Brazilians within two to three years, due to the high costs.
Faria agrees, “Hiring foreigners can cost up to three times the salary paid to a Brazilian. The cost includes school for their children, moving expenses, room and board and
a car.”
For foreigners considering a relocation to try their luck in Brazil’s heated job market, it is important to do the research and evaluate carefully.
“Maybe in three to five years it may be worth it for middle managers, but it will depend on the exchange rate and changes in governmental policy, which I don’t see on the
horizon,” said Faria.
www.tecconcursos.com.br/questoes/213732
RIO DE JANEIRO, BRAZIL – Spearheaded by record investment in the petroleum and natural gas industry, Brazil’s job market continues to grow at a breakneck pace. Billion dollar investments by the government and private companies have created a positive landscape for job seekers, with no sign of abating.
“The demand for professionals will continue to increase. I believe we will see an even larger demand in two to three years due to project maintenance and expansion,”
said Rafael Faria, Head of Business Recruiting in Oil & Gas for a global recruiting corporation.
With investments of US$224 billion over the next four years by the major Brazilian oil and gas company, as well as investments by almost all major multinational oil
companies in the exploration of new oil and gas fields, qualified workers are a hot commodity. A federal government estimate indicates that the new Brazilian
oil fields will require 250,000 new professionals through 2016.
Among the professionals most in demand are operations managers, logistics managers, project managers, contract managers and engineers. According to Faria, one of
the most challenging positions to fill is the Contract Manager, which requires a good amount of experience in dealing with the large oil companies and their complex rules
and regulations.
“Human Resource managers are at wits end,” said Rose Santos, Human Resource Manager at an international organization specialized in deepwater engineering services
for the oil industry. “Everyone is fighting for the best professionals. Engineers are getting hired right out of college.”
Most universities offer an undergraduate degree in Petroleum Engineering, and it has become the most sought-after course, passing medicine.
But not only managers are in high demand, skilled workers to build, maintain, repair and perform technical installations on the drill rigs, platforms, ships and other offshore
and onshore structures are essential.
Training courses and programs are trying to keep up with the demand. SENAI (Professional training school) has doubled the number of professional training courses in the
last four years. PROMINP, Programa de Mobilização da Indústria de Petróleo e Gás Natural, a training program developed in 2003 in conjunction with a major oil company
to train ‘blue collar’ workers, plans to turn out 212,000 professionals
by 2014.
Some companies opt to search beyond Brazil’s borders to find professionals. Many of the multinational companies that previously had only a single representative in Brazil,
are looking to extend their presence and have to import talent. Work visas can be a challenge to obtain though, and permanent visas also involve significant immigration
procedures.
While many companies tend to import professionals from their home base, according to Santos, it is common practice to try to replace them
with Brazilians within two to three years, due to the high costs.
Faria agrees, “Hiring foreigners can cost up to three times the salary paid to a Brazilian. The cost includes school for their children, moving expenses, room and board and
a car.”
For foreigners considering a relocation to try their luck in Brazil’s heated job market, it is important to do the research and evaluate carefully.
“Maybe in three to five years it may be worth it for middle managers, but it will depend on the exchange rate and changes in governmental policy, which I don’t see on the
horizon,” said Faria.
Concerning the future of the oil job market, Text II suggests that
a) petroleum and natural gas industries will soon be facing a shortage of skilled workers in the global market.
b) qualified professionals for specific positions in the oil industry will find more opportunities in the Brazilian job market.
c) factory floor staff with technical skills will soon be replaced by specialized employees with a university degree.
d) local expertise will be outnumbered by foreign professionals, since Brazilian engineers are not qualified for the oil industry.
e) more jobs are going to be created to attract a higher number of foreign professionals to the Brazilian oil industry in the next decade.
www.tecconcursos.com.br/questoes/213734
So you’re thinking about a field job in the oil industry. If you haven’t been involved in the oil patch before, you probably have no idea how vast it is, or where to start your
job search. Many sites will try to convince you that you can get a job on an offshore rig making $10,000 a month without any experience or training at all, and while this is
possible, it’s not at all likely. Actually, it can be tough to find a job in any field of the oil industry without some experience or training.
First, you should realize that the oil industry isn’t just drilling rigs, pumpjacks, and gas stations. The oil industry is a lot like the military in that it employs people in nearly
every profession. There are positions such as roughneck or airgun operator, that are very specific to the oil industry; but there are also welders, medics, chemists,
biologists, environmentalists, cooks, computer programmers, engineers, and a thousand more positions that are absolutely essential to the industry. You don’t have to have
experience specifically in the oil industry in order to have relevant experience.
The oil patch is a little bit different from most other industries. You’ll soon lose the idea of a weekend as you now know it... The patch runs seven days a week, and in
many cases, 24 hours a day. You’ll be expected to work every day in all weather conditions, for weeks or even months at a time. The oil industry is also very production
oriented; you’ll make more money welding in the oil patch than in another industry, but you’ll work longer and harder for that bigger paycheck.
There are a few prerequisites if you want a field job in the oil patch:
You must be in reasonably good physical condition, and be able to lift at least 50 lbs. regularly.
For most positions, you must have a valid driver’s license.
You must have suitable clothing for extended outdoor work and in most cases, hard toed safety boots.
You should not have any medical condition which would make it unsafe for you to operate machinery.
You don’t need to live in the city where your employer is located, but in most cases you will have to provide your own transportation to and from your home from
the employer’s location (point-of-hire). If you live a long way from any area with oil and gas activity, you will have a very difficult time finding an entry level job in
this industry.
You must be willing and able to work hard for long hours. This industry is all about production, and if you don’t produce, you’re not an asset to the company.
You must be drug-free. Most companies conduct pre-employment drug screenings and random testing of employees. If your test shows signs of illegal drugs in your system, you will not be hired. Most oil work requires you to live away from home, in motels or camps near the jobs. Your travel, accommodations, and meals will usually be paid by your employer while you’re working. Most companies also provide all required safety supplies, such as hard hats and reflective safety vests. You are required to supply your own work clothes, boots, gloves, etc.
Before you leave for your first job, be sure you have appropriate clothing to spend 14 hours outside... frostbite isn’t fun, neither is heat stroke.
Much of the work in the oil industry is very physically demanding, especially in the entry level positions. There is no upper age limit, but you should be willing and able to
work hard for long hours, lift 50 lbs regularly, and be in relatively good physical condition. If you have back or other health problems that prevent strenuous activity, you
may want to reconsider this line of work. Most companies require employees to be at least 18 years old. A recent hearing test and/or medical evaluation may be required.
Many oilfield companies also require a preemployment drug and alcohol screening. You should know that though you can make a lot of money in a month in the oil patch,
you can also make no money in a month. Most oilfield work isn’t very stable, and you’ll occasionally find yourself laid-off on short notice due to a shortage of work... and
called back on even shorter notice. Many people in Canada work in the oil industry during the winter while it’s busy, then take the spring and summer off, or work non-
oilfield summer jobs.
Offshore and overseas rigs usually operate year-round, offering a much more stable work environment; but there are very few positions on these rigs that are available
without any experience. If you’re interested in working on one of these rigs, you may want to start with a catering job. All major offshore and overseas projects employ
catering staff to provide meals for the rig crew. These positions are often available without experience, and rig managers will often hire catering staff onto the rig crew if
they need an extra hand, or if a member of the rig crew gets injured or leaves. It’s a matter of being in the right place at the right time, and showing interest in working
on the rig.
Available at: <http://www.oilfieldworkers.com/oilfieldintro.php> Retrieved on: Aug. 29, 2012.
Text II
RIO DE JANEIRO, BRAZIL – Spearheaded by record investment in the petroleum and natural gas industry, Brazil’s job market continues to grow at a breakneck pace. Billion
dollar investments by the government and private companies have created a positive landscape for job seekers, with no sign of abating.
“The demand for professionals will continue to increase. I believe we will see an even larger demand in two to three years due to project maintenance and expansion,”
said Rafael Faria, Head of Business Recruiting in Oil & Gas for a global recruiting corporation.
With investments of US$224 billion over the next four years by the major Brazilian oil and gas company, as well as investments by almost all major multinational oil
companies in the exploration of new oil and gas fields, qualified workers are a hot commodity. A federal government estimate indicates that the new Brazilian
oil fields will require 250,000 new professionals through 2016.
Among the professionals most in demand are operations managers, logistics managers, project managers, contract managers and engineers. According to Faria, one of
the most challenging positions to fill is the Contract Manager, which requires a good amount of experience in dealing with the large oil companies and their complex rules
and regulations.
“Human Resource managers are at wits end,” said Rose Santos, Human Resource Manager at an international organization specialized in deepwater engineering services
for the oil industry. “Everyone is fighting for the best professionals. Engineers are getting hired right out of college.”
Most universities offer an undergraduate degree in Petroleum Engineering, and it has become the most sought-after course, passing medicine.
But it is not only managers who are in high demand: skilled workers to build, maintain, repair and perform technical installations on the drill rigs, platforms, ships and other offshore
and onshore structures are also essential.
Training courses and programs are trying to keep up with the demand. SENAI (Professional training school) has doubled the number of professional training courses in the
last four years. PROMINP, Programa de Mobilização da Indústria de Petróleo e Gás Natural, a training program developed in 2003 in conjunction with a major oil company
to train ‘blue collar’ workers, plans to turn out 212,000 professionals
by 2014.
Some companies opt to search beyond Brazil’s borders to find professionals. Many of the multinational companies that previously had only a single representative in Brazil
are looking to extend their presence and have to import talent. Work visas can be a challenge to obtain though, and permanent visas also involve significant immigration
procedures.
While many companies tend to import professionals from their home base, according to Santos, it is common practice to try to replace them
with Brazilians within two to three years, due to the high costs.
Faria agrees, “Hiring foreigners can cost up to three times the salary paid to a Brazilian. The cost includes school for their children, moving expenses, room and board and
a car.”
For foreigners considering a relocation to try their luck in Brazil’s heated job market, it is important to do the research and evaluate carefully.
“Maybe in three to five years it may be worth it for middle managers, but it will depend on the exchange rate and changes in governmental policy, which I don’t see on the
horizon,” said Faria.
www.tecconcursos.com.br/questoes/286264
Protecting Patagonia
At the southernmost tip of South America, one of the last untouched expanses of land on the planet is now under threat of irreversible destruction.
Chile’s Patagonia is home to the snow-capped Andes, dense temperate rainforests, lush valleys and meadows, abundant marine and bird species and traditional
communities living a low-impact lifestyle. All of these could be devastated if a proposed hydroelectric complex called HidroAysén is constructed on two of the last free-
flowing rivers in the world. The massive dams would:
While big energy companies are lobbying for this ill-conceived project, Chile’s vast renewable energy resources remain untapped and its energy efficiency opportunities
unrealized. The Natural Resources Defense Council is working with a coalition to stop the HidroAysén complex and help Chile instead reach its renewable energy and
energy efficiency potential.
Chile has vast renewable energy resources and energy efficiency opportunities and the potential to become a world leader in clean energy technologies. The Chilean
government has signed numerous renewable energy development initiatives with the U.S. and other countries. A key part of these agreements includes information-
sharing, which means that a decision to pursue renewable energy in Chile would help advance alternative energy programs in the U.S. and around the world as we learn
from groundbreaking new programs.
Chile has unparalleled potential for renewable energy and energy efficiency. The country has an abundance of untapped solar, wind and geothermal energy sources, which
could easily meet the country’s future energy needs. All of these alternative solutions are more sustainable, less destructive and more stable than the large hydro-electric
and coal power sources that currently dominate Chile’s energy industry.
NRDC is working with a broad coalition of citizens, community groups and national and international NGOs to oppose the hydro dam project and push for sustainable
energy solutions.
www.tecconcursos.com.br/questoes/286269
The environmental impact of society’s dependence on natural resources is undeniable, though with these resources set up as the lifeblood of modern industry, many
continue to downplay the urgency of freeing ourselves from this dependence. The most compelling and visual consequence of burning carbon based fuels is the resulting
CO2 emissions that act as greenhouse gases, the primary contributor to global warming. However, as more people study how the resources are developed, it becomes
apparent that the environmental damage from these extraction processes may be as significant as the emissions.
A recent article from Scientific American demonstrates how little is known about the dangers of extracting natural gas from the earth. Even though the process had been
deemed safe by the EPA, new research suggests that this may not be the case. Complicating the matter further is the fact that the chemical mixture that companies use in
the extraction is considered a trade secret, and they are resistant to providing this info on the grounds of secrecy.
The technology for burning coal with fewer emissions is expensive yet feasible, but there is no account for the environmental damage done during the extraction process.
Moreover, around the globe, corporations are continually exploiting indigenous regions for their resources. A report from the World Wildlife Fund details the various
problems that are occurring in the Amazon region as a result of resource exploitation. It explains the various effects that oil and gas extraction can have, including
deforestation, regional conflict, biodiversity loss, and soil and aquatic pollution. Selling the rights to these resources to companies can be a strong move economically for
poor regions, but the long-term effects will greatly outweigh the benefits on a global scale.
The need for alternative energy supplies is urgent, and it will only serve the public to discuss the full extent of the environmental dangers that carbon based fuels pose. By
not entering the dangers of extraction into the case for developing clean energy, the argument for implementing these technologies is diminished, and the needs of the
environment will continue to be overshadowed by global events like the current financial crisis.
Available at: <http://common-breath.com/the-environmental-consequences-of-natural-resource-extraction/>. Retrieved on: Jan. 10th, 2014. Adapted.
www.tecconcursos.com.br/questoes/286270
The environmental impact of society’s dependence on natural resources is undeniable, though with these resources set up as the lifeblood of modern industry, many
continue to downplay the urgency of freeing ourselves from this dependence. The most compelling and visual consequence of burning carbon based fuels is the resulting
CO2 emissions that act as greenhouse gases, the primary contributor to global warming. However, as more people study how the resources are developed, it becomes
apparent that the environmental damage from these extraction processes may be as significant as the emissions.
A recent article from Scientific American demonstrates how little is known about the dangers of extracting natural gas from the earth. Even though the process had been
deemed safe by the EPA, new research suggests that this may not be the case. Complicating the matter further is the fact that the chemical mixture that companies use in
the extraction is considered a trade secret, and they are resistant to providing this info on the grounds of secrecy.
The technology for burning coal with fewer emissions is expensive yet feasible, but there is no account for the environmental damage done during the extraction process.
Moreover, around the globe, corporations are continually exploiting indigenous regions for their resources. A report from the World Wildlife Fund details the various
problems that are occurring in the Amazon region as a result of resource exploitation. It explains the various effects that oil and gas extraction can have, including
deforestation, regional conflict, biodiversity loss, and soil and aquatic pollution. Selling the rights to these resources to companies can be a strong move economically for
poor regions, but the long-term effects will greatly outweigh the benefits on a global scale.
The need for alternative energy supplies is urgent, and it will only serve the public to discuss the full extent of the environmental dangers that carbon based fuels pose. By
not entering the dangers of extraction into the case for developing clean energy, the argument for implementing these technologies is diminished, and the needs of the
environment will continue to be overshadowed by global events like the current financial crisis.
Available at: <http://common-breath.com/the-environmental-consequences-of-natural-resource-extraction/>. Retrieved on: Jan. 10th, 2014. Adapted.
www.tecconcursos.com.br/questoes/286271
The environmental impact of society’s dependence on natural resources is undeniable, though with these resources set up as the lifeblood of modern industry, many
continue to downplay the urgency of freeing ourselves from this dependence. The most compelling and visual consequence of burning carbon based fuels is the resulting
CO2 emissions that act as greenhouse gases, the primary contributor to global warming. However, as more people study how the resources are developed, it becomes
apparent that the environmental damage from these extraction processes may be as significant as the emissions.
A recent article from Scientific American demonstrates how little is known about the dangers of extracting natural gas from the earth. Even though the process had been
deemed safe by the EPA, new research suggests that this may not be the case. Complicating the matter further is the fact that the chemical mixture that companies use in
the extraction is considered a trade secret, and they are resistant to providing this info on the grounds of secrecy.
The technology for burning coal with fewer emissions is expensive yet feasible, but there is no account for the environmental damage done during the extraction process.
Moreover, around the globe, corporations are continually exploiting indigenous regions for their resources. A report from the World Wildlife Fund details the various
problems that are occurring in the Amazon region as a result of resource exploitation. It explains the various effects that oil and gas extraction can have, including
deforestation, regional conflict, biodiversity loss, and soil and aquatic pollution. Selling the rights to these resources to companies can be a strong move economically for
poor regions, but the long-term effects will greatly outweigh the benefits on a global scale.
The need for alternative energy supplies is urgent, and it will only serve the public to discuss the full extent of the environmental dangers that carbon based fuels pose. By
not entering the dangers of extraction into the case for developing clean energy, the argument for implementing these technologies is diminished, and the needs of the
environment will continue to be overshadowed by global events like the current financial crisis.
Available at: <http://common-breath.com/the-environmental-consequences-of-natural-resource-extraction/>. Retrieved on: Jan. 10th, 2014. Adapted.
Among the problems which occur in the Amazon region listed by the World Wildlife Fund report, in the 3rd paragraph of Text II, the one that is NOT mentioned is
a) water pollution
b) local conflicts
c) devastation of the forest
d) the diminution of biodiversity
e) the aggravation of the greenhouse effect
www.tecconcursos.com.br/questoes/286304
When the government of the French-speaking province of Quebec introduced a bill in November to stop public servants from wearing religious symbols, it gave a
community hospital in neighbouring Ontario a chance to grab some new recruits. Lakeridge Health ran an advertisement in a Quebec medical-school newspaper showing a woman wearing a hijab and stethoscope over the caption: “We don’t care what’s on your head, we care what’s in it.” Applications doubled, says Kevin Empey, the hospital’s boss.
The Quebec government’s proposed ban and the Ontario hospital’s welcome illustrate the poles in the Canadian debate on multiculturalism. Public hearings on the law
began on January 14th. Supporters say that the ban is needed to enshrine state secularism; opponents that it is a cynical appeal to xenophobia by the minority provincial
government of the Parti Québécois (PQ). Either way, the prediction of Jean-François Lisée, a PQ minister, that the Quebec battle could be the last stand in Canada’s
multicultural experiment does not stand up to close scrutiny.
Immigration itself is not in question. Canadians, even in Quebec, overwhelmingly back mass immigration, which adds an average of 250,000 newcomers (roughly 0.8% of
the population) each year. First-generation immigrants make up a bigger share of Toronto’s and Vancouver’s populations than in many of the world’s great cosmopolitan
cities [. . .].
Unlike many Europeans, Canadians believe that immigrants create jobs rather than steal them, says Jeffrey Reitz, a sociologist who has surveyed attitudes in Europe and
Canada. This view is partly based on history. Modern Canada was built by successive waves of immigrants, first from Europe and more recently from Asia.
It is also a result of policies that since the 1970s have focused on admitting the most employable people. The government constantly tweaks its system of awarding points
to prospective immigrants for languages, education and skills, in order to match them with labour-market gaps. Younger applicants currently have an edge. An array of
programmes, many of them focused on the ability to speak languages, help immigrants to settle in.
The Quebec dispute is not over numbers of immigrants, but how to accommodate them. In the 1970s Canada officially adopted the creed of “multiculturalism”, a murky
concept that celebrates cultural differences at the same time as pushing newcomers to integrate. English speaking Canadians see multiculturalism as central to their
national identity, ranking below universal health care and the Canadian flag in a recent survey by Environics, a research firm, but above ice hockey, the Mounties and the
Queen.
The governing Conservatives are blunter than opposition parties about the obligation on newcomers to integrate and about cultural practices, such as genital mutilation,
that are unacceptable. But their support for multiculturalism is not in question. After the latest federal cabinet reshuffle there was even a tussle over who was the senior
multiculturalism minister.
By contrast, French-speaking Quebeckers have long been more tepid about the subject. Many think it undermines their role as one of modern Canada’s founding cultures.
The government in Quebec prefers the doctrine of “interculturalism”, which emphasises assimilation into the dominant culture. This is popular in rural areas, where
immigrants are few and PQ support is strong, but extremely unpopular in Montreal, where most of the province’s newcomers live.
www.tecconcursos.com.br/questoes/286305
When the government of the French-speaking province of Quebec introduced a bill in November to stop public servants from wearing religious symbols, it gave a
community hospital in neighbouring Ontario a chance to grab some new recruits. Lakeridge Health ran an advertisement in a Quebec medical-school newspaper showing a
woman wearing a hijab and stethoscope over the caption: “We don’t care what’s on your head, we care what’s in it.” Applications doubled, says Kevin Empey, the hospital’s
boss.
The Quebec government’s proposed ban and the Ontario hospital’s welcome illustrate the poles in the Canadian debate on multiculturalism. Public hearings on the law
began on January 14th. Supporters say that the ban is needed to enshrine state secularism; opponents that it is a cynical appeal to xenophobia by the minority provincial
government of the Parti Québécois (PQ). Either way, the prediction of Jean-François Lisée, a PQ minister, that the Quebec battle could be the last stand in Canada’s
multicultural experiment does not stand up to close scrutiny.
Immigration itself is not in question. Canadians, even in Quebec, overwhelmingly back mass immigration, which adds an average of 250,000 newcomers (roughly 0.8% of
the population) each year. First-generation immigrants make up a bigger share of Toronto’s and Vancouver’s populations than in many of the world’s great cosmopolitan
cities [. . .].
Unlike many Europeans, Canadians believe that immigrants create jobs rather than steal them, says Jeffrey Reitz, a sociologist who has surveyed attitudes in Europe and
Canada. This view is partly based on history. Modern Canada was built by successive waves of immigrants, first from Europe and more recently from Asia.
It is also a result of policies that since the 1970s have focused on admitting the most employable people. The government constantly tweaks its system of awarding points
to prospective immigrants for languages, education and skills, in order to match them with labour-market gaps. Younger applicants currently have an edge. An array of
programmes, many of them focused on the ability to speak languages, help immigrants to settle in.
The Quebec dispute is not over numbers of immigrants, but how to accommodate them. In the 1970s Canada officially adopted the creed of “multiculturalism”, a murky
concept that celebrates cultural differences at the same time as pushing newcomers to integrate. English speaking Canadians see multiculturalism as central to their
national identity, ranking below universal health care and the Canadian flag in a recent survey by Environics, a research firm, but above ice hockey, the Mounties and the
Queen.
The governing Conservatives are blunter than opposition parties about the obligation on newcomers to integrate and about cultural practices, such as genital mutilation,
that are unacceptable. But their support for multiculturalism is not in question. After the latest federal cabinet reshuffle there was even a tussle over who was the senior
multiculturalism minister.
By contrast, French-speaking Quebeckers have long been more tepid about the subject. Many think it undermines their role as one of modern Canada’s founding cultures.
The government in Quebec prefers the doctrine of “interculturalism”, which emphasises assimilation into the dominant culture. This is popular in rural areas, where
immigrants are few and PQ support is strong, but extremely unpopular in Montreal, where most of the province’s newcomers live.
www.tecconcursos.com.br/questoes/286309
Innovation, the elixir of progress, has always cost people their jobs. In the Industrial Revolution artisan weavers were swept aside by the mechanical loom. Over the past
30 years the digital revolution has displaced many of the mid-skill jobs that underpinned 20th-century middle-class life. Typists, ticket agents, bank tellers and many
production-line jobs have been dispensed with, just as the weavers were.
For those, including this newspaper, who believe that technological progress has made the world a better place, such churn is a natural part of rising prosperity. Although
innovation kills some jobs, it creates new and better ones, as a more productive society becomes richer and its wealthier inhabitants demand more goods and services. A
hundred years ago one in three American workers was employed on a farm. Today less than 2% of them produce far more food. The millions freed from the land were
not consigned to joblessness, but found better-paid work as the economy grew more sophisticated. Today the pool of secretaries has shrunk, but there are ever more
computer programmers and web designers.
Optimism remains the right starting-point, but for workers the dislocating effects of technology may make themselves evident faster than its benefits. Technology’s impact
will feel like a tornado, hitting the rich world first, but eventually sweeping through poorer countries too. No government is prepared for it.
Why be worried? It is partly just a matter of history repeating itself. In the early part of the Industrial Revolution the rewards of increasing productivity went
disproportionately to capital; later on, labour reaped most of the benefits. The pattern today is similar. The prosperity unleashed by the Digital Revolution has gone
overwhelmingly to the owners of capital and the highest-skilled workers.
Many of the jobs most at risk are lower down the ladder (logistics, haulage), whereas the skills that are least vulnerable to automation (creativity, managerial expertise)
tend to be higher up, so median wages are likely to remain stagnant for some time and income gaps are likely to widen.
Anger about rising inequality is bound to grow, but politicians will find it hard to address the problem. Shunning progress would be as futile now as the Luddites’ protests
against mechanised looms were in the 1810s, because any country that tried to stop would be left behind by competitors eager to embrace new technology. The freedom
to raise taxes on the rich to punitive levels will be similarly constrained by the mobility of capital and highly skilled labour.
The main way in which governments can help their people through this dislocation is through education systems. One of the reasons for the improvement in workers’
fortunes in the latter part of the Industrial Revolution was that schools were built to educate them—a dramatic change at the time. Now those schools themselves
need to be changed, to foster the creativity that humans will need to set them apart from computers. There should be less rote-learning and more critical thinking.
Innovation has brought great benefits to humanity. Nobody in their right mind would want to return to the world of handloom weavers. But the benefits of technological
progress are unevenly distributed, especially in the early stages of each new wave, and it is up to governments to spread them. In the 19th century it took the threat of
revolution to bring about progressive reforms. Today’s governments would do well to start making the changes needed before their people get angry.
According to Text II, although the Industrial and Digital Revolutions are more than 200 years apart, they have many similarities, EXCEPT that they
www.tecconcursos.com.br/questoes/286310
Innovation, the elixir of progress, has always cost people their jobs. In the Industrial Revolution artisan weavers were swept aside by the mechanical loom. Over the past
30 years the digital revolution has displaced many of the mid-skill jobs that underpinned 20th-century middle-class life. Typists, ticket agents, bank tellers and many
production-line jobs have been dispensed with, just as the weavers were.
For those, including this newspaper, who believe that technological progress has made the world a better place, such churn is a natural part of rising prosperity. Although
innovation kills some jobs, it creates new and better ones, as a more productive society becomes richer and its wealthier inhabitants demand more goods and services. A
hundred years ago one in three American workers was employed on a farm. Today less than 2% of them produce far more food. The millions freed from the land were
not consigned to joblessness, but found better-paid work as the economy grew more sophisticated. Today the pool of secretaries has shrunk, but there are ever more
computer programmers and web designers.
Optimism remains the right starting-point, but for workers the dislocating effects of technology may make themselves evident faster than its benefits. Technology’s impact will feel like a tornado, hitting the rich world first, but eventually sweeping through poorer countries too. No government is prepared for it.
Why be worried? It is partly just a matter of history repeating itself. In the early part of the Industrial Revolution the rewards of increasing productivity went
disproportionately to capital; later on, labour reaped most of the benefits. The pattern today is similar. The prosperity unleashed by the Digital Revolution has gone
overwhelmingly to the owners of capital and the highest-skilled workers.
Many of the jobs most at risk are lower down the ladder (logistics, haulage), whereas the skills that are least vulnerable to automation (creativity, managerial expertise)
tend to be higher up, so median wages are likely to remain stagnant for some time and income gaps are likely to widen.
Anger about rising inequality is bound to grow, but politicians will find it hard to address the problem. Shunning progress would be as futile now as the Luddites’ protests
against mechanised looms were in the 1810s, because any country that tried to stop would be left behind by competitors eager to embrace new technology. The freedom
to raise taxes on the rich to punitive levels will be similarly constrained by the mobility of capital and highly skilled labour.
The main way in which governments can help their people through this dislocation is through education systems. One of the reasons for the improvement in workers’
fortunes in the latter part of the Industrial Revolution was that schools were built to educate them—a dramatic change at the time. Now those schools themselves
need to be changed, to foster the creativity that humans will need to set them apart from computers. There should be less rote-learning and more critical thinking.
Innovation has brought great benefits to humanity. Nobody in their right mind would want to return to the world of handloom weavers. But the benefits of technological
progress are unevenly distributed, especially in the early stages of each new wave, and it is up to governments to spread them. In the 19th century it took the threat of
revolution to bring about progressive reforms. Today’s governments would do well to start making the changes needed before their people get angry.
In Text II, it’s implied that innovation is the elixir of progress in both Revolutions, but it has its downside because
www.tecconcursos.com.br/questoes/286314
Innovation, the elixir of progress, has always cost people their jobs. In the Industrial Revolution artisan weavers were swept aside by the mechanical loom. Over the past
30 years the digital revolution has displaced many of the mid-skill jobs that underpinned 20th-century middle-class life. Typists, ticket agents, bank tellers and many
production-line jobs have been dispensed with, just as the weavers were.
For those, including this newspaper, who believe that technological progress has made the world a better place, such churn is a natural part of rising prosperity. Although
innovation kills some jobs, it creates new and better ones, as a more productive society becomes richer and its wealthier inhabitants demand more goods and services. A
hundred years ago one in three American workers was employed on a farm. Today less than 2% of them produce far more food. The millions freed from the land were
not consigned to joblessness, but found better-paid work as the economy grew more sophisticated. Today the pool of secretaries has shrunk, but there are ever more
computer programmers and web designers.
Optimism remains the right starting-point, but for workers the dislocating effects of technology may make themselves evident faster than its benefits. Technology’s impact
will feel like a tornado, hitting the rich world first, but eventually sweeping through poorer countries too. No government is prepared for it.
Why be worried? It is partly just a matter of history repeating itself. In the early part of the Industrial Revolution the rewards of increasing productivity went
disproportionately to capital; later on, labour reaped most of the benefits. The pattern today is similar. The prosperity unleashed by the Digital Revolution has gone
overwhelmingly to the owners of capital and the highest-skilled workers.
Many of the jobs most at risk are lower down the ladder (logistics, haulage), whereas the skills that are least vulnerable to automation (creativity, managerial expertise)
tend to be higher up, so median wages are likely to remain stagnant for some time and income gaps are likely to widen.
Anger about rising inequality is bound to grow, but politicians will find it hard to address the problem. Shunning progress would be as futile now as the Luddites’ protests
against mechanised looms were in the 1810s, because any country that tried to stop would be left behind by competitors eager to embrace new technology. The freedom
to raise taxes on the rich to punitive levels will be similarly constrained by the mobility of capital and highly skilled labour.
The main way in which governments can help their people through this dislocation is through education systems. One of the reasons for the improvement in workers’
fortunes in the latter part of the Industrial Revolution was that schools were built to educate them—a dramatic change at the time. Now those schools themselves
need to be changed, to foster the creativity that humans will need to set them apart from computers. There should be less rote-learning and more critical thinking.
Innovation has brought great benefits to humanity. Nobody in their right mind would want to return to the world of handloom weavers. But the benefits of technological
progress are unevenly distributed, especially in the early stages of each new wave, and it is up to governments to spread them. In the 19th century it took the threat of
revolution to bring about progressive reforms. Today’s governments would do well to start making the changes needed before their people get angry.
World oil market prospects for the second half of the year
[...] World oil demand in 2H14 is anticipated to increase by 1.2 mb/d over the same period last year to average 92.1 mb/d. OECD (Organisation for Economic Co-operation
and Development) demand is projected to decline by around 60 tb/d, despite positive growth in OECD Americas, mainly due to a general improvement in the US economy.
OECD Europe and OECD Asia Pacific are expected to see a lesser contraction than a year earlier. However, oil demand growth in OECD Asia Pacific will largely be impacted
by any restart of nuclear power plants in Japan. Non-OECD countries are projected to lead oil demand growth this year and forecast to add 1.3 mb/d in 2H14 compared to
the same period a year ago. Nevertheless, risks to the forecast include the pace of economic growth in major economies in the OECD, China, India and Russia, as well as
policy reforms in retail prices and substitution toward natural gas.
On the supply side, non-OPEC oil supply in the second half of the year is expected to increase by 1.2 mb/d over the same period last year to average around 55.9 mb/d,
with the US being the main driver for growth, followed by Canada. Production in Russia and Brazil is also expected to increase in 2H14. However, oil output from the UK
and Mexico is projected to continue to decline. The forecast for non-OPEC supply growth for 2H14 is seen lower than in the first half of the year, but could increase given
forecasts for a mild hurricane season in the US Gulf. Less field maintenance in the North Sea and easing geopolitical tensions could also add further barrels in the coming
two quarters. OPEC NGLs are also projected to continue to increase, adding 0.2 mb/d in 2H14 to stand at 5.9 mb/d.
Taking these developments into account, the supply-demand balance for 2H14 shows that the demand for OPEC crude in the second half of the year stands at around 30.3
mb/d, slightly higher than in the first half of the year. This compares to OPEC production, according to secondary sources, of close to 30.0 mb/d in May. Global inventories
are at sufficient levels, with OECD commercial stocks in days of forward cover at around 58 days in April. Moreover, inventories in the US – the only OECD country with
positive demand growth – stand at high levels. Non-OECD inventories are also on the rise, especially in China, which has been building Strategic Petroleum Reserves
(SPR) at a time when apparent demand is weakening due to slowing economic activities. [...]
Available at: <http://www.opec.org/opec_web/static_files_project/media/download/publications/MOMR_June_2014.pdf>. Retrieved on: 15 June 2014. Adapted.
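The “supply-demand balance” figure quoted above follows from simple arithmetic on the passage’s own numbers: world demand minus non-OPEC supply minus OPEC NGLs. A minimal sketch in Python, reusing only the figures given in the text (the variable names are illustrative, not from the report):

# Back-of-the-envelope check of the 2H14 balance quoted in the passage above.
# All figures are in mb/d and come from the text; the names are illustrative.
world_demand = 92.1        # forecast world oil demand, 2H14
non_opec_supply = 55.9     # forecast non-OPEC supply, 2H14
opec_ngls = 5.9            # forecast OPEC natural gas liquids, 2H14
demand_for_opec_crude = world_demand - non_opec_supply - opec_ngls
print(round(demand_for_opec_crude, 1))  # prints 30.3, matching the figure in the text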
www.tecconcursos.com.br/questoes/2381752
World oil market prospects for the second half of the year
[...] World oil demand in 2H14 is anticipated to increase by 1.2 mb/d over the same period last year to average 92.1 mb/d. OECD (Organisation for Economic Co-operation
and Development) demand is projected to decline by around 60 tb/d, despite positive growth in OECD Americas, mainly due to a general improvement in the US economy.
OECD Europe and OECD Asia Pacific are expected to see a lesser contraction than a year earlier. However, oil demand growth in OECD Asia Pacific will largely be impacted
by any restart of nuclear power plants in Japan. Non-OECD countries are projected to lead oil demand growth this year and forecast to add 1.3 mb/d in 2H14 compared to
the same period a year ago. Nevertheless, risks to the forecast include the pace of economic growth in major economies in the OECD, China, India and Russia, as well as
policy reforms in retail prices and substitution toward natural gas.
On the supply side, non-OPEC oil supply in the second half of the year is expected to increase by 1.2 mb/d over the same period last year to average around 55.9 mb/d,
with the US being the main driver for growth, followed by Canada. Production in Russia and Brazil is also expected to increase in 2H14. However, oil output from the UK
and Mexico is projected to continue to decline. The forecast for non-OPEC supply growth for 2H14 is seen lower than in the first half of the year, but could increase given
forecasts for a mild hurricane season in the US Gulf. Less field maintenance in the North Sea and easing geopolitical tensions could also add further barrels in the coming
two quarters. OPEC NGLs are also projected to continue to increase, adding 0.2 mb/d in 2H14 to stand at 5.9 mb/d.
Taking these developments into account, the supply-demand balance for 2H14 shows that the demand for OPEC crude in the second half of the year stands at around 30.3
mb/d, slightly higher than in the first half of the year. This compares to OPEC production, according to secondary sources, of close to 30.0 mb/d in May. Global inventories
are at sufficient levels, with OECD commercial stocks in days of forward cover at around 58 days in April. Moreover, inventories in the US – the only OECD country with
positive demand growth – stand at high levels. Non-OECD inventories are also on the rise, especially in China, which has been building Strategic Petroleum Reserves
(SPR) at a time when apparent demand is weakening due to slowing economic activities. [...]
Available at: <http://www.opec.org/opec_web/static_files_project/media/download/publications/MOMR_June_2014.pdf>. Retrieved on: 15 June 2014. Adapted.
According to Text I, the statement “OECD Europe and OECD Asia Pacific are expected to see a lesser contraction than a year earlier” implies that the oil demand in those
countries
www.tecconcursos.com.br/questoes/2381753
World oil market prospects for the second half of the year
[...] World oil demand in 2H14 is anticipated to increase by 1.2 mb/d over the same period last year to average 92.1 mb/d. OECD (Organisation for Economic Co-operation
and Development) demand is projected to decline by around 60 tb/d, despite positive growth in OECD Americas, mainly due to a general improvement in the US economy.
OECD Europe and OECD Asia Pacific are expected to see a lesser contraction than a year earlier. However, oil demand growth in OECD Asia Pacific will largely be impacted
by any restart of nuclear power plants in Japan. Non-OECD countries are projected to lead oil demand growth this year and forecast to add 1.3 mb/d in 2H14 compared to
the same period a year ago. Nevertheless, risks to the forecast include the pace of economic growth in major economies in the OECD, China, India and Russia, as well as
policy reforms in retail prices and substitution toward natural gas.
On the supply side, non-OPEC oil supply in the second half of the year is expected to increase by 1.2 mb/d over the same period last year to average around 55.9 mb/d,
with the US being the main driver for growth, followed by Canada. Production in Russia and Brazil is also expected to increase in 2H14. However, oil output from the UK
and Mexico is projected to continue to decline. The forecast for non-OPEC supply growth for 2H14 is seen lower than in the first half of the year, but could increase given
forecasts for a mild hurricane season in the US Gulf. Less field maintenance in the North Sea and easing geopolitical tensions could also add further barrels in the coming
two quarters. OPEC NGLs are also projected to continue to increase, adding 0.2 mb/d in 2H14 to stand at 5.9 mb/d.
Taking these developments into account, the supply-demand balance for 2H14 shows that the demand for OPEC crude in the second half of the year stands at around 30.3
mb/d, slightly higher than in the first half of the year. This compares to OPEC production, according to secondary sources, of close to 30.0 mb/d in May. Global inventories
are at sufficient levels, with OECD commercial stocks in days of forward cover at around 58 days in April. Moreover, inventories in the US – the only OECD country with
positive demand growth – stand at high levels. Non-OECD inventories are also on the rise, especially in China, which has been building Strategic Petroleum Reserves
(SPR) at a time when apparent demand is weakening due to slowing economic activities. [...]
Available at: <http://www.opec.org/opec_web/static_files_project/media/download/publications/MOMR_June_2014.pdf>. Retrieved on: 15 June 2014. Adapted.
According to Text I, the statement “On the supply side, non-OPEC oil supply in the second half of the year is expected to increase by 1.2 mb/d over the same period last
year to average around 55.9 mb/d, with the US being the main driver for growth, followed by Canada” implies that
www.tecconcursos.com.br/questoes/2381798
The global oil market will undergo sweeping changes over the next five years. The 2013 Medium-Term Oil Market Report evaluates the impact of these changes on the
global oil system by 2018 based on all that we know today – current expectations of economic growth, existing or announced policies and regulations, commercially proven
technologies, field decline rates, investment programmes (upstream, midstream and downstream), etc. The five-year forecast period corresponds to the length of the
typical investment cycle and as such is critical to policymakers and market participants.
This Report shows, in detailed but concise terms, why the ongoing North American hydrocarbon revolution is a ‘game changer’. The region’s expected contribution to
supply growth, however impressive, is only part of the story: Crude quality, infrastructure requirements, current regulations, and the potential for replication elsewhere are
bound to spark a chain reaction that will leave few links in the global oil supply chain unaffected.
While North America is expected to lead medium-term supply growth, the East-of-Suez region is in the lead on the demand side. Non-OECD oil demand, led by Asia and
the Middle East, looks set to overtake the OECD for the first time as early as 2Q13 and will widen its lead afterwards. Non-OECD economies are already home to over half of
global refining capacity. With that share only expected to grow by 2018, the non-OECD region will be firmly entrenched as the world’s largest crude importer.
These and other changes are carefully laid out in this Report, which also examines recent and future changes in global oil storage, shifts in OPEC production capacity and
crude and product trade, and the consequences of the ongoing refinery construction boom in emerging markets and developing economies.
It is required reading for anyone engaged in policy or investment decision-making in the energy sphere, and those more broadly interested in the oil market and the global
economy.
Comparing the excerpt from Text I “Non-OECD countries are projected to lead oil demand growth this year and forecast to add 1.3 mb/d in 2H14 compared to the same
period a year ago” (lines 13-15) to the excerpt from Text II “Non-OECD oil demand, led by Asia and the Middle East, looks set to overtake the OECD for the first time as
early as 2Q13 and will widen its lead afterwards” (lines 24-27), one states that Text number
www.tecconcursos.com.br/questoes/135958
CESGRANRIO - PB (BNDES)/BNDES/Psicologia/2013
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
134) Coworking: Sharing How We Work
Genevieve DeGuzman
Communication
In the past, when trying to find places to work, independent workers, small businesses, and organizations often had to choose between several scenarios, all with their
attendant advantages and disadvantages: working from home; working from a coffee shop, library, or other public venue; or leasing an executive suite or other
commercial space.
Coworking takes freelancers, indie workers, and entrepreneurs who feel that they have been dormant or isolated working alone at home or who have been migrating from a coffee shop to a friend’s garage or languishing in a sterile business center — to a space where they can truly roost.
“We can come out of hiding,” a coworker tells us, “and be in a space that’s comfortable, friendly, and has an aesthetic appeal that’s a far cry from the typical cookie-cutter
office environment.”
For many, it might be puzzling to pay for a well-equipped space teeming with other people, even with the chance of free coffee and inspiration. You might ask yourself,
“Well, why pay for a place to work when I’m perfectly comfortable at home and paying nothing?” Or, “Isn’t the whole point of telecommuting or starting my own business a
chance to avoid ‘going to the office’?”
Coworking may sound like an unnecessary expense, but let’s consider what you get from being a part of the space.
At its most basic level, coworking is the phenomenon of workers coming together in a shared or collaborative workspace for one or more of these reasons: to reduce costs
by having shared facilities and equipment, to access a community of fellow entrepreneurs, and to seek out collaboration within and across fields. Coworking spaces offer
an exciting alternative for people longing to escape the confines of their cubicle walls, the isolation of working solo at home, or the inconveniences of public venues.
The benefits and cost-savings in productivity and overall happiness and well-being reaped from coworking are also potentially huge. Enthusiasm and creativity become
contagious and multiply when you diversify your work environment with people from different fields or backgrounds. At coworking spaces, members pass each other
during the day, conversations get going, and miraculously idea-fusion happens with everyone benefitting from the shared thinking and brainstorming.
Differences matter. Coworking hinges on the belief that innovation and inspiration come from the cross-pollination of different people in different fields or specializations.
Random opportunities and discoveries that arise from interactions with others play a large role in coworking.
To see this in action on a large scale, think about Google. Google made the culture of sharing and collaboration in the workplace legend. It deployed “grouplets” for
initiatives that cover broader changes through the organization.
One remarkable story of a successful Google grouplet involved getting engineers to write their own testing code to reduce the incidence of bugs in software code. Thinking
creatively, the grouplet came up with a campaign based on posting episodes discussing new and interesting testing techniques on the bathroom stalls. “Testing on the
Toilet” spread fast and garnered both rants and raves. Soon, people were hungry for more, and the campaign ultimately developed enough inertia to become a de facto
part of the coding culture. They moved out of the restrooms and into the mainstream.
Keith Sawyer, a professor of psychology and education at Washington University in St. Louis, MO, has written widely on collaboration and innovation. In his study of jazz
performances, Keith Sawyer made this observation, “The group has the ideas, not the individual musicians.” Some of the most famous products were born out of this mosh
pit of interaction — in contrast to the romantic idea of a lone working genius driving change. According to Sawyer, more often than not, true innovation emerges from an
improvised process and draws from trial-by-error and many inputs.
Unexpected insights emerge from the group dynamic. If increasing interaction among different peer groups within a single company could lead to promising results,
imagine the possibilities for solopreneurs, small businesses, and indie workers — if only they could reach similar levels of peer access as those experienced by their bigger
counterparts. It is this potential that coworking tries to capture for its members.
www.tecconcursos.com.br/questoes/135962
CESGRANRIO - PB (BNDES)/BNDES/Psicologia/2013
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
135) Coworking: Sharing How We Work
Genevieve DeGuzman
Communication
In the past, when trying to find places to work, independent workers, small businesses, and organizations often had to choose between several scenarios, all with their
attendant advantages and disadvantages: working from home; working from a coffee shop, library, or other public venue; or leasing an executive suite or other
commercial space.
Coworking takes freelancers, indie workers, and entrepreneurs who feel that they have been dormant or isolated working alone at home or who have been migrating from
a coffee shop to a friend’s garage or languishing in a sterile business center — to a space where they can truly roost.
“We can come out of hiding,” a coworker tells us, “and be in a space that’s comfortable, friendly, and has an aesthetic appeal that’s a far cry from the typical cookie-cutter
office environment.”
For many, it might be puzzling to pay for a well-equipped space teeming with other people, even with the chance of free coffee and inspiration. You might ask yourself,
“Well, why pay for a place to work when I’m perfectly comfortable at home and paying nothing?” Or, “Isn’t the whole point of telecommuting or starting my own business a
chance to avoid ‘going to the office’?”
Coworking may sound like an unnecessary expense, but let’s consider what you get from being a part of the space.
At its most basic level, coworking is the phenomenon of workers coming together in a shared or collaborative workspace for one or more of these reasons: to reduce costs
by having shared facilities and equipment, to access a community of fellow entrepreneurs, and to seek out collaboration within and across fields. Coworking spaces offer
an exciting alternative for people longing to escape the confines of their cubicle walls, the isolation of working solo at home, or the inconveniences of public venues.
The benefits and cost-savings in productivity and overall happiness and well-being reaped from coworking are also potentially huge. Enthusiasm and creativity become
contagious and multiply when you diversify your work environment with people from different fields or backgrounds. At coworking spaces, members pass each other
during the day, conversations get going, and miraculously idea-fusion happens with everyone benefitting from the shared thinking and brainstorming.
Differences matter. Coworking hinges on the belief that innovation and inspiration come from the cross-pollination of different people in different fields or specializations. Random opportunities and discoveries that arise from interactions with others play a large role in coworking.
To see this in action on a large scale, think about Google. Google made the culture of sharing and collaboration in the workplace legend. It deployed “grouplets” for
initiatives that cover broader changes through the organization.
One remarkable story of a successful Google grouplet involved getting engineers to write their own testing code to reduce the incidence of bugs in software code. Thinking
creatively, the grouplet came up with a campaign based on posting episodes discussing new and interesting testing techniques on the bathroom stalls. “Testing on the
Toilet” spread fast and garnered both rants and raves. Soon, people were hungry for more, and the campaign ultimately developed enough inertia to become a de facto
part of the coding culture. They moved out of the restrooms and into the mainstream.
Keith Sawyer, a professor of psychology and education at Washington University in St. Louis, MO, has written widely on collaboration and innovation. In his study of jazz
performances, Keith Sawyer made this observation, “The group has the ideas, not the individual musicians.” Some of the most famous products were born out of this mosh
pit of interaction — in contrast to the romantic idea of a lone working genius driving change. According to Sawyer, more often than not, true innovation emerges from an
improvised process and draws from trial-by-error and many inputs.
Unexpected insights emerge from the group dynamic. If increasing interaction among different peer groups within a single company could lead to promising results,
imagine the possibilities for solopreneurs, small businesses, and indie workers — if only they could reach similar levels of peer access as those experienced by their bigger
counterparts. It is this potential that coworking tries to capture for its members.
According to the text, all the reasons below are benefits that support the choice of a collaborative workplace, EXCEPT:
www.tecconcursos.com.br/questoes/135963
CESGRANRIO - PB (BNDES)/BNDES/Psicologia/2013
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
136) Coworking: Sharing How We Work
Genevieve DeGuzman
Communication
In the past, when trying to find places to work, independent workers, small businesses, and organizations often had to choose between several scenarios, all with their
attendant advantages and disadvantages: working from home; working from a coffee shop, library, or other public venue; or leasing an executive suite or other
commercial space.
Coworking takes freelancers, indie workers, and entrepreneurs who feel that they have been dormant or isolated working alone at home or who have been migrating from
a coffee shop to a friend’s garage or languishing in a sterile business center — to a space where they can truly roost.
“We can come out of hiding,” a coworker tells us, “and be in a space that’s comfortable, friendly, and has an aesthetic appeal that’s a far cry from the typical cookie-cutter
office environment.”
For many, it might be puzzling to pay for a well-equipped space teeming with other people, even with the chance of free coffee and inspiration. You might ask yourself,
“Well, why pay for a place to work when I’m perfectly comfortable at home and paying nothing?” Or, “Isn’t the whole point of telecommuting or starting my own business a
chance to avoid ‘going to the office’?”
Coworking may sound like an unnecessary expense, but let’s consider what you get from being a part of the space.
At its most basic level, coworking is the phenomenon of workers coming together in a shared or collaborative workspace for one or more of these reasons: to reduce costs
by having shared facilities and equipment, to access a community of fellow entrepreneurs, and to seek out collaboration within and across fields. Coworking spaces offer
an exciting alternative for people longing to escape the confines of their cubicle walls, the isolation of working solo at home, or the inconveniences of public venues.
The benefits and cost-savings in productivity and overall happiness and well-being reaped from coworking are also potentially huge. Enthusiasm and creativity become
contagious and multiply when you diversify your work environment with people from different fields or backgrounds. At coworking spaces, members pass each other
during the day, conversations get going, and miraculously idea-fusion happens with everyone benefitting from the shared thinking and brainstorming.
Differences matter. Coworking hinges on the belief that innovation and inspiration come from the cross-pollination of different people in different fields or specializations.
Random opportunities and discoveries that arise from interactions with others play a large role in coworking.
To see this in action on a large scale, think about Google. Google made the culture of sharing and collaboration in the workplace legend. It deployed “grouplets” for
initiatives that cover broader changes through the organization.
One remarkable story of a successful Google grouplet involved getting engineers to write their own testing code to reduce the incidence of bugs in software code. Thinking
creatively, the grouplet came up with a campaign based on posting episodes discussing new and interesting testing techniques on the bathroom stalls. “Testing on the
Toilet” spread fast and garnered both rants and raves. Soon, people were hungry for more, and the campaign ultimately developed enough inertia to become a de facto
part of the coding culture. They moved out of the restrooms and into the mainstream.
Keith Sawyer, a professor of psychology and education at Washington University in St. Louis, MO, has written widely on collaboration and innovation. In his study of jazz
performances, Keith Sawyer made this observation, “The group has the ideas, not the individual musicians.” Some of the most famous products were born out of this mosh
pit of interaction — in contrast to the romantic idea of a lone working genius driving change. According to Sawyer, more often than not, true innovation emerges from an
improvised process and draws from trial-by-error and many inputs.
Unexpected insights emerge from the group dynamic. If increasing interaction among different peer groups within a single company could lead to promising results,
imagine the possibilities for solopreneurs, small businesses, and indie workers — if only they could reach similar levels of peer access as those experienced by their bigger
counterparts. It is this potential that coworking tries to capture for its members.
a) contrast the legends on workplace productivity with Google’s large scale marketing initiatives.
b) argument with a counter-example to prove that coworking does not always bring about a successful result.
c) suggest that it is essential to campaign for new techniques that will foster inertia in the work environment.
d) illustrate how software engineers can find better solutions for bathroom installations.
e) demonstrate through example how workers in different specializations can collaborate to find innovative solutions for the business.
www.tecconcursos.com.br/questoes/135965
CESGRANRIO - PB (BNDES)/BNDES/Psicologia/2013
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
137) Coworking: Sharing How We Work
Genevieve DeGuzman
Communication
In the past, when trying to find places to work, independent workers, small businesses, and organizations often had to choose between several scenarios, all with their
attendant advantages and disadvantages: working from home; working from a coffee shop, library, or other public venue; or leasing an executive suite or other
commercial space.
Coworking takes freelancers, indie workers, and entrepreneurs who feel that they have been dormant or isolated working alone at home or who have been migrating from
a coffee shop to a friend’s garage or languishing in a sterile business center — to a space where they can truly roost.
“We can come out of hiding,” a coworker tells us, “and be in a space that’s comfortable, friendly, and has an aesthetic appeal that’s a far cry from the typical cookie-cutter
office environment.”
For many, it might be puzzling to pay for a well-equipped space teeming with other people, even with the chance of free coffee and inspiration. You might ask yourself,
“Well, why pay for a place to work when I’m perfectly comfortable at home and paying nothing?” Or, “Isn’t the whole point of telecommuting or starting my own business a
chance to avoid ‘going to the office’?”
Coworking may sound like an unnecessary expense, but let’s consider what you get from being a part of the space.
At its most basic level, coworking is the phenomenon of workers coming together in a shared or collaborative workspace for one or more of these reasons: to reduce costs
by having shared facilities and equipment, to access a community of fellow entrepreneurs, and to seek out collaboration within and across fields. Coworking spaces offer
an exciting alternative for people longing to escape the confines of their cubicle walls, the isolation of working solo at home, or the inconveniences of public venues.
The benefits and cost-savings in productivity and overall happiness and well-being reaped from coworking are also potentially huge. Enthusiasm and creativity become
contagious and multiply when you diversify your work environment with people from different fields or backgrounds. At coworking spaces, members pass each other
during the day, conversations get going, and miraculously idea-fusion happens with everyone benefitting from the shared thinking and brainstorming.
Differences matter. Coworking hinges on the belief that innovation and inspiration come from the cross-pollination of different people in different fields or specializations.
Random opportunities and discoveries that arise from interactions with others play a large role in coworking.
To see this in action on a large scale, think about Google. Google made the culture of sharing and collaboration in the workplace legend. It deployed “grouplets” for
initiatives that cover broader changes through the organization.
One remarkable story of a successful Google grouplet involved getting engineers to write their own testing code to reduce the incidence of bugs in software code. Thinking
creatively, the grouplet came up with a campaign based on posting episodes discussing new and interesting testing techniques on the bathroom stalls. “Testing on the
Toilet” spread fast and garnered both rants and raves. Soon, people were hungry for more, and the campaign ultimately developed enough inertia to become a de facto
part of the coding culture. They moved out of the restrooms and into the mainstream.
Keith Sawyer, a professor of psychology and education at Washington University in St. Louis, MO, has written widely on collaboration and innovation. In his study of jazz
performances, Keith Sawyer made this observation, “The group has the ideas, not the individual musicians.” Some of the most famous products were born out of this mosh
pit of interaction — in contrast to the romantic idea of a lone working genius driving change. According to Sawyer, more often than not, true innovation emerges from an
improvised process and draws from trial-by-error and many inputs.
Unexpected insights emerge from the group dynamic. If increasing interaction among different peer groups within a single company could lead to promising results,
imagine the possibilities for solopreneurs, small businesses, and indie workers — if only they could reach similar levels of peer access as those experienced by their bigger
counterparts. It is this potential that coworking tries to capture for its members.
Professor Keith Sawyer mentions that “The group has the ideas, not the individual musicians.” to mean that
a) the dispute among consumers is the key to profitable product-design changes.
b) the famous products result from professionals working individually to achieve the aims of the group.
c) improvisation and trial-and-error always leads to the best solutions for the market place.
d) good jazz performances are made up of individual musicians who strive to play their instruments far louder than the others.
e) it is the whole orchestra that makes the music sound pleasant just as it is the whole professional team that will achieve a successful solution.
www.tecconcursos.com.br/questoes/135967
CESGRANRIO - PB (BNDES)/BNDES/Psicologia/2013
Língua Inglesa (Inglês) - Interpretação de Textos (Understanding)
138) Coworking: Sharing How We Work
Genevieve DeGuzman
Communication
Coworking takes freelancers, indie workers, and entrepreneurs who feel that they have been dormant or isolated working alone at home or who have been migrating from
a coffee shop to a friend’s garage or languishing in a sterile business center — to a space where they can truly roost.
“We can come out of hiding,” a coworker tells us, “and be in a space that’s comfortable, friendly, and has an aesthetic appeal that’s a far cry from the typical cookie-cutter
office environment.”
For many, it might be puzzling to pay for a well-equipped space teeming with other people, even with the chance of free coffee and inspiration. You might ask yourself,
“Well, why pay for a place to work when I’m perfectly comfortable at home and paying nothing?” Or, “Isn’t the whole point of telecommuting or starting my own business a
chance to avoid ‘going to the office’?”
Coworking may sound like an unnecessary expense, but let’s consider what you get from being a part of the space.
At its most basic level, coworking is the phenomenon of workers coming together in a shared or collaborative workspace for one or more of these reasons: to reduce costs
by having shared facilities and equipment, to access a community of fellow entrepreneurs, and to seek out collaboration within and across fields. Coworking spaces offer
an exciting alternative for people longing to escape the confines of their cubicle walls, the isolation of working solo at home, or the inconveniences of public venues.
The benefits and cost-savings in productivity and overall happiness and well-being reaped from coworking are also potentially huge. Enthusiasm and creativity become
contagious and multiply when you diversify your work environment with people from different fields or backgrounds. At coworking spaces, members pass each other
during the day, conversations get going, and miraculously idea-fusion happens with everyone benefitting from the shared thinking and brainstorming.
Differences matter. Coworking hinges on the belief that innovation and inspiration come from the cross-pollination of different people in different fields or specializations.
Random opportunities and discoveries that arise from interactions with others play a large role in coworking.
To see this in action on a large scale, think about Google. Google made the culture of sharing and collaboration in the workplace legend. It deployed “grouplets” for
initiatives that cover broader changes through the organization.
One remarkable story of a successful Google grouplet involved getting engineers to write their own testing code to reduce the incidence of bugs in software code. Thinking
creatively, the grouplet came up with a campaign based on posting episodes discussing new and interesting testing techniques on the bathroom stalls. “Testing on the
Toilet” spread fast and garnered both rants and raves. Soon, people were hungry for more, and the campaign ultimately developed enough inertia to become a de facto
part of the coding culture. They moved out of the restrooms and into the mainstream.
Keith Sawyer, a professor of psychology and education at Washington University in St. Louis, MO, has written widely on collaboration and innovation. In his study of jazz
performances, Keith Sawyer made this observation, “The group has the ideas, not the individual musicians.” Some of the most famous products were born out of this mosh
pit of interaction — in contrast to the romantic idea of a lone working genius driving change. According to Sawyer, more often than not, true innovation emerges from an
improvised process and draws from trial-by-error and many inputs.
Unexpected insights emerge from the group dynamic. If increasing interaction among different peer groups within a single company could lead to promising results,
imagine the possibilities for solopreneurs, small businesses, and indie workers — if only they could reach similar levels of peer access as those experienced by their bigger
counterparts. It is this potential that coworking tries to capture for its members.
The only one which can be considered as an argument against coworking is:
a) ‘One of the best things is that I pay lower than I would for a dedicated office, so I don’t feel pressured to go to the coworking facility every day.’
b) ‘Though my home office is great and I love it, I sometimes need the distance and collaborative environment that my coworking space provides.’
c) ‘The vibe of being around others can feel like a wave carrying you even when you’re not sure where to go – if you need a little social boost.’
d) ‘Perhaps you won’t like any of the other people at your coworking space, or that the proprietors aren’t putting much effort into socializing or collaboration.’
e) ‘The shared space provides instant community and a stimulating atmosphere around other professionals working towards the same intentions as I am.’
www.tecconcursos.com.br/questoes/136234
Education advocates in Latin America have long pushed for expanded access for all students. Indeed, access has improved, with secondary school completion rates
climbing from 30 to 50 percent over the past two decades. However, there is a growing realization that greater access alone will do little good without higher quality.
Business leaders, in particular, have argued that there is a profound disconnect between what schools are teaching and what is actually required for a worker to succeed in
a globalized, innovation-driven, and knowledge-based modern economy. “There are very talented people in the region. All they need is a chance to develop,” says Felipe
Vergara, co-founder of Lumni, a company that invests in students’ education in exchange for a fixed portion of the income they will go on to receive with their improved
career path.
At the same time that the private sector is beginning to take matters into its own hands, a new report from a team of Inter-American Development Bank education
researchers, led by Marina Bassi and Jaime Vargas, has shed new light on the failures of Latin American education systems to prepare students for the job market. Entitled
“Disconnected: Skills, Education and Employment in Latin America”, the report uses surveys of both students and employers across the region to understand why and how
this gap in skills is occurring.
The results are surprising. While access has increased, in two other critical areas - quality and relevance - there has been little to no progress, leaving students
unprepared for the demands of the modern workplace. The employers surveyed all pointed to the importance of what are known as “socio-emotional skills”, in contrast to
traditional cognitive skills such as literacy and basic mathematics. Socio-emotional skills relate to personality, and include punctuality, politeness, work ethics, responsibility,
empathy, and adaptability, and are especially critical for workers and managers in a globalized economy defined by its unpredictability and dynamism.
Companies have a strong role to play, and some of them are just not giving up. As Juan Iramain, Vice President of Public Affairs and Communications in Coca Cola’s South
Latin region, puts it, “at the Coca-Cola Company we understand that in order to catch up with the necessary level of sustainability of the globalized world, our business
should rely on the sustainability of the communities in which we operate. For some time now, therefore, we have been dealing with specialized NGOs to strengthen the
work of parents and school. The aim is not only for students to complete the school year, but also that they incorporate the curiosity and lifelong learning capabilities
needed to work in the labor market of the 21st century. We just can’t put up with a school program that cannot prepare youngsters for a better society”.
But above all, as the authors Marina Bassi and Jaime Vargas have argued, we must continue this dialogue between governments and the private sector so that education
reform can lead to increased opportunity and economic development across the region.
a) have reason to suppose that secondary education problems have all ended.
b) have reason to suppose that secondary education problems with quality have improved.
c) can be happy because education quality rate has climbed over 30 percent.
d) could be happy concerning students’ access to secondary school and completion of the course.
e) should be very concerned with the poor rate of access to secondary school.
www.tecconcursos.com.br/questoes/136236
Education advocates in Latin America have long pushed for expanded access for all students. Indeed, access has improved, with secondary school completion rates
climbing from 30 to 50 percent over the past two decades. However, there is a growing realization that greater access alone will do little good without higher quality.
Business leaders, in particular, have argued that there is a profound disconnect between what schools are teaching and what is actually required for a worker to succeed in
a globalized, innovation-driven, and knowledge-based modern economy. “There are very talented people in the region. All they need is a chance to develop,” says Felipe
Vergara, co-founder of Lumni, a company that invests in students’ education in exchange for a fixed portion of the income they will go on to receive with their improved
career path.
At the same time that the private sector is beginning to take matters into its own hands, a new report from a team of Inter-American Development Bank education
researchers, led by Marina Bassi and Jaime Vargas, has shed new light on the failures of Latin American education systems to prepare students for the job market. Entitled
“Disconnected: Skills, Education and Employment in Latin America”, the report uses surveys of both students and employers across the region to understand why and how
this gap in skills is occurring.
The results are surprising. While access has increased, in two other critical areas - quality and relevance - there has been little to no progress, leaving students
unprepared for the demands of the modern workplace. The employers surveyed all pointed to the importance of what are known as “socio-emotional skills”, in contrast to
traditional cognitive skills such as literacy and basic mathematics. Socio-emotional skills relate to personality, and include punctuality, politeness, work ethics, responsibility,
empathy, and adaptability, and are especially critical for workers and managers in a globalized economy defined by its unpredictability and dynamism.
While high costs are certainly playing a role, it is clear that addressing the skills gap in Latin America will require a multifaceted approach. As the authors of “Disconnected”
argue, schools must find ways to become more engaged with the productive economy that surrounds them, and improve their ability to instill and evaluate the type of skills
that the private sector is looking for. This effort should go beyond increasing the access and completion of secondary school. It should involve more research, better
teacher recruitment and evaluation, and incentives for developing socioemotional skills.
Companies have a strong role to play, and some of them are just not giving up. As Juan Iramain, Vice President of Public Affairs and Communications in Coca Cola’s South
Latin region, puts it, “at the Coca-Cola Company we understand that in order to catch up with the necessary level of sustainability of the globalized world, our business
should rely on the sustainability of the communities in which we operate. For some time now, therefore, we have been dealing with specialized NGOs to strengthen the
work of parents and school. The aim is not only for students to complete the school year, but also that they incorporate the curiosity and lifelong learning capabilities
needed to work in the labor market of the 21st century. We just can’t put up with a school program that cannot prepare youngsters for a better society”.
But above all, as the authors Marina Bassi and Jaime Vargas have argued, we must continue this dialogue between governments and the private sector so that education
reform can lead to increased opportunity and economic development across the region.
The failures of Latin American education systems have been pointed out by
a) students
b) the job market
c) the private sector business
d) a team of education researchers
e) business leaders such as Marina Bassi and Jaime Vargas
www.tecconcursos.com.br/questoes/136238
Education advocates in Latin America have long pushed for expanded access for all students. Indeed, access has improved, with secondary school completion rates
climbing from 30 to 50 percent over the past two decades. However, there is a growing realization that greater access alone will do little good without higher quality.
Business leaders, in particular, have argued that there is a profound disconnect between what schools are teaching and what is actually required for a worker to succeed in
a globalized, innovation-driven, and knowledge-based modern economy. “There are very talented people in the region. All they need is a chance to develop,” says Felipe
Vergara, co-founder of Lumni, a company that invests in students’ education in exchange for a fixed portion of the income they will go on to receive with their improved
career path.
At the same time that the private sector is beginning to take matters into its own hands, a new report from a team of Inter-American Development Bank education
researchers, led by Marina Bassi and Jaime Vargas, has shed new light on the failures of Latin American education systems to prepare students for the job market. Entitled
“Disconnected: Skills, Education and Employment in Latin America”, the report uses surveys of both students and employers across the region to understand why and how
this gap in skills is occurring.
The results are surprising. While access has increased, in two other critical areas - quality and relevance - there has been little to no progress, leaving students
unprepared for the demands of the modern workplace. The employers surveyed all pointed to the importance of what are known as “socio-emotional skills”, in contrast to
traditional cognitive skills such as literacy and basic mathematics. Socio-emotional skills relate to personality, and include punctuality, politeness, work ethics, responsibility,
empathy, and adaptability, and are especially critical for workers and managers in a globalized economy defined by its unpredictability and dynamism.
While high costs are certainly playing a role, it is clear that addressing the skills gap in Latin America will require a multifaceted approach. As the authors of “Disconnected”
argue, schools must find ways to become more engaged with the productive economy that surrounds them, and improve their ability to instill and evaluate the type of skills
that the private sector is looking for. This effort should go beyond increasing the access and completion of secondary school. It should involve more research, better
teacher recruitment and evaluation, and incentives for developing socioemotional skills.
Companies have a strong role to play, and some of them are just not giving up. As Juan Iramain, Vice President of Public Affairs and Communications in Coca Cola’s South
Latin region, puts it, “at the Coca-Cola Company we understand that in order to catch up with the necessary level of sustainability of the globalized world, our business
should rely on the sustainability of the communities in which we operate. For some time now, therefore, we have been dealing with specialized NGOs to strengthen the
work of parents and school. The aim is not only for students to complete the school year, but also that they incorporate the curiosity and lifelong learning capabilities
needed to work in the labor market of the 21st century. We just can’t put up with a school program that cannot prepare youngsters for a better society”.
But above all, as the authors Marina Bassi and Jaime Vargas have argued, we must continue this dialogue between governments and the private sector so that education
reform can lead to increased opportunity and economic development across the region.
www.tecconcursos.com.br/questoes/136242
Education advocates in Latin America have long pushed for expanded access for all students. Indeed, access has improved, with secondary school completion rates
climbing from 30 to 50 percent over the past two decades. However, there is a growing realization that greater access alone will do little good without higher quality.
Business leaders, in particular, have argued that there is a profound disconnect between what schools are teaching and what is actually required for a worker to succeed in
a globalized, innovation-driven, and knowledge-based modern economy. “There are very talented people in the region. All they need is a chance to develop,” says Felipe
Vergara, co-founder of Lumni, a company that invests in students’ education in exchange for a fixed portion of the income they will go on to receive with their improved
career path.
At the same time that the private sector is beginning to take matters into its own hands, a new report from a team of Inter-American Development Bank education
researchers, led by Marina Bassi and Jaime Vargas, has shed new light on the failures of Latin American education systems to prepare students for the job market. Entitled
“Disconnected: Skills, Education and Employment in Latin America”, the report uses surveys of both students and employers across the region to understand why and how
this gap in skills is occurring.
The results are surprising. While access has increased, in two other critical areas - quality and relevance - there has been little to no progress, leaving students
unprepared for the demands of the modern workplace. The employers surveyed all pointed to the importance of what are known as “socio-emotional skills”, in contrast to
traditional cognitive skills such as literacy and basic mathematics. Socio-emotional skills relate to personality, and include punctuality, politeness, work ethics, responsibility,
empathy, and adaptability, and are especially critical for workers and managers in a globalized economy defined by its unpredictability and dynamism.
While high costs are certainly playing a role, it is clear that addressing the skills gap in Latin America will require a multifaceted approach. As the authors of “Disconnected”
argue, schools must find ways to become more engaged with the productive economy that surrounds them, and improve their ability to instill and evaluate the type of skills
that the private sector is looking for. This effort should go beyond increasing the access and completion of secondary school. It should involve more research, better
teacher recruitment and evaluation, and incentives for developing socioemotional skills.
Companies have a strong role to play, and some of them are just not giving up. As Juan Iramain, Vice President of Public Affairs and Communications in Coca Cola’s South
Latin region, puts it, “at the Coca-Cola Company we understand that in order to catch up with the necessary level of sustainability of the globalized world, our business
should rely on the sustainability of the communities in which we operate. For some time now, therefore, we have been dealing with specialized NGOs to strengthen the
work of parents and school. The aim is not only for students to complete the school year, but also that they incorporate the curiosity and lifelong learning capabilities needed to work in the labor market of the 21st century. We just can’t put up with a school program that cannot prepare youngsters for a better society”.
But above all, as the authors Marina Bassi and Jaime Vargas have argued, we must continue this dialogue between governments and the private sector so that education
reform can lead to increased opportunity and economic development across the region.
What measure has not proven sufficient in the past to address the skills gap in Latin American Education?
www.tecconcursos.com.br/questoes/136246
Education advocates in Latin America have long pushed for expanded access for all students. Indeed, access has improved, with secondary school completion rates
climbing from 30 to 50 percent over the past two decades. However, there is a growing realization that greater access alone will do little good without higher quality.
Business leaders, in particular, have argued that there is a profound disconnect between what schools are teaching and what is actually required for a worker to succeed in
a globalized, innovation-driven, and knowledge-based modern economy. “There are very talented people in the region. All they need is a chance to develop,” says Felipe
Vergara, co-founder of Lumni, a company that invests in students’ education in exchange for a fixed portion of the income they will go on to receive with their improved
career path.
At the same time that the private sector is beginning to take matters into its own hands, a new report from a team of Inter-American Development Bank education
researchers, led by Marina Bassi and Jaime Vargas, has shed new light on the failures of Latin American education systems to prepare students for the job market. Entitled
“Disconnected: Skills, Education and Employment in Latin America”, the report uses surveys of both students and employers across the region to understand why and how
this gap in skills is occurring.
The results are surprising. While access has increased, in two other critical areas - quality and relevance - there has been little to no progress, leaving students
unprepared for the demands of the modern workplace. The employers surveyed all pointed to the importance of what are known as “socio-emotional skills”, in contrast to
traditional cognitive skills such as literacy and basic mathematics. Socio-emotional skills relate to personality, and include punctuality, politeness, work ethics, responsibility,
empathy, and adaptability, and are especially critical for workers and managers in a globalized economy defined by its unpredictability and dynamism.
While high costs are certainly playing a role, it is clear that addressing the skills gap in Latin America will require a multifaceted approach. As the authors of “Disconnected”
argue, schools must find ways to become more engaged with the productive economy that surrounds them, and improve their ability to instill and evaluate the type of skills
that the private sector is looking for. This effort should go beyond increasing the access and completion of secondary school. It should involve more research, better
teacher recruitment and evaluation, and incentives for developing socioemotional skills.
Companies have a strong role to play, and some of them are just not giving up. As Juan Iramain, Vice President of Public Affairs and Communications in Coca Cola’s South
Latin region, puts it, “at the Coca-Cola Company we understand that in order to catch up with the necessary level of sustainability of the globalized world, our business
should rely on the sustainability of the communities in which we operate. For some time now, therefore, we have been dealing with specialized NGOs to strengthen the
work of parents and school. The aim is not only for students to complete the school year, but also that they incorporate the curiosity and lifelong learning capabilities
needed to work in the labor market of the 21st century. We just can’t put up with a school program that cannot prepare youngsters for a better society”.
But above all, as the authors Marina Bassi and Jaime Vargas have argued, we must continue this dialogue between governments and the private sector so that education
reform can lead to increased opportunity and economic development across the region.
According to the text, the ultimate solution to better education proposed by the authors of the “Disconnected” Report is to
www.tecconcursos.com.br/questoes/224152
Most birds and other animals migrate for three basic reasons. First, animals must look for food, and maybe they may have depleted the resources in a particular area
where they are. Or they may be trying to keep up with the changing patterns of available vegetation. This is what the zebras in Serengeti Forest in Africa do each year.
They follow rainfall patterns in order to find ample and fresh vegetation.
Second, animals may migrate to escape extreme seasonal temperatures. For example, many birds fly south to warmer climates, for the winter, while others travel to get
special seasonal shelters. Little brown bats fly 200 to 800 km from their outdoor home to their winter caves that provide a safe place for them to hibernate during the cold
months.
Third, animals migrate to get to their breeding ground. Salmon, for instance, swim from the ocean to the exact area of the river where they were born. Then, they
reproduce in this area before they die.
Now, in terms of how they migrate, if we are talking about day animals, they keep visual landmarks in their minds to help them move from one place to another. With the
elephants, for instance, the oldest female memorizes the position of rivers, mountain ranges and other fixed spots, in order to take the members of the herd to food and
fresh water. Flying animals such as birds and insects may use the position of the sun to guide them as a compass. If they are night animals, their specialized nocturnal
vision allows them to see the shades of the trees in the dark. But to migrate at night in the forest, night animals’ preferred method is by guiding themselves by observing
the position of the stars, just like exploring navigators used to do in the XV century.
Available at: <http://www.kidsdiscover.com/blog/spotlight/animalmigrations- for-kids/> Retrieved on: nov. 20, 2012. Adapted.
In the text, the first reason given for an animal to migrate is its
a) desire for a particular area.
b) search for food.
c) need to get protection from the rain.
d) necessity to deplete the resources.
e) urge to follow other animals.
www.tecconcursos.com.br/questoes/224154
Most birds and other animals migrate for three basic reasons. First, animals must look for food, and maybe they may have depleted the resources in a particular area
where they are. Or they may be trying to keep up with the changing patterns of available vegetation. This is what the zebras in Serengeti Forest in Africa do each year.
They follow rainfall patterns in order to find ample and fresh vegetation.
Second, animals may migrate to escape extreme seasonal temperatures. For example, many birds fly south to warmer climates, for the winter, while others travel to get
special seasonal shelters. Little brown bats fly 200 to 800 km from their outdoor home to their winter caves that provide a safe place for them to hibernate during the cold
months.
Third, animals migrate to get to their breeding ground. Salmon, for instance, swim from the ocean to the exact area of the river where they were born. Then, they
reproduce in this area before they die.
Now, in terms of how they migrate, if we are talking about day animals, they keep visual landmarks in their minds to help them move from one place to another. With the
elephants, for instance, the oldest female memorizes the position of rivers, mountain ranges and other fixed spots, in order to take the members of the herd to food and
fresh water. Flying animals such as birds and insects may use the position of the sun to guide them as a compass. If they are night animals, their specialized nocturnal
vision allows them to see the shades of the trees in the dark. But to migrate at night in the forest, night animals’ preferred method is by guiding themselves by observing
the position of the stars, just like exploring navigators used to do in the XV century.
Available at: <http://www.kidsdiscover.com/blog/spotlight/animalmigrations- for-kids/> Retrieved on: nov. 20, 2012. Adapted.
www.tecconcursos.com.br/questoes/224155
Most birds and other animals migrate for three basic reasons. First, animals must look for food, and maybe they may have depleted the resources in a particular area
where they are. Or they may be trying to keep up with the changing patterns of available vegetation. This is what the zebras in Serengeti Forest in Africa do each year.
They follow rainfall patterns in order to find ample and fresh vegetation.
Second, animals may migrate to escape extreme seasonal temperatures. For example, many birds fly south to warmer climates, for the winter, while others travel to get
special seasonal shelters. Little brown bats fly 200 to 800 km from their outdoor home to their winter caves that provide a safe place for them to hibernate during the cold months.
Third, animals migrate to get to their breeding ground. Salmon, for instance, swim from the ocean to the exact area of the river where they were born. Then, they
reproduce in this area before they die.
Now, in terms of how they migrate, if we are talking about day animals, they keep visual landmarks in their minds to help them move from one place to another. With the
elephants, for instance, the oldest female memorizes the position of rivers, mountain ranges and other fixed spots, in order to take the members of the herd to food and
fresh water. Flying animals such as birds and insects may use the position of the sun to guide them as a compass. If they are night animals, their specialized nocturnal
vision allows them to see the shades of the trees in the dark. But to migrate at night in the forest, night animals’ preferred method is by guiding themselves by observing
the position of the stars, just like exploring navigators used to do in the XV century.
Available at: <http://www.kidsdiscover.com/blog/spotlight/animalmigrations- for-kids/> Retrieved on: nov. 20, 2012. Adapted.
www.tecconcursos.com.br/questoes/224156
Most birds and other animals migrate for three basic reasons. First, animals must look for food, and maybe they may have depleted the resources in a particular area
where they are. Or they may be trying to keep up with the changing patterns of available vegetation. This is what the zebras in Serengeti Forest in Africa do each year.
They follow rainfall patterns in order to find ample and fresh vegetation.
Second, animals may migrate to escape extreme seasonal temperatures. For example, many birds fly south to warmer climates, for the winter, while others travel to get
special seasonal shelters. Little brown bats fly 200 to 800 km from their outdoor home to their winter caves that provide a safe place for them to hibernate during the cold
months.
Third, animals migrate to get to their breeding ground. Salmon, for instance, swim from the ocean to the exact area of the river where they were born. Then, they
reproduce in this area before they die.
Now, in terms of how they migrate, if we are talking about day animals, they keep visual landmarks in their minds to help them move from one place to another. With the
elephants, for instance, the oldest female memorizes the position of rivers, mountain ranges and other fixed spots, in order to take the members of the herd to food and
fresh water. Flying animals such as birds and insects may use the position of the sun to guide them as a compass. If they are night animals, their specialized nocturnal
vision allows them to see the shades of the trees in the dark. But to migrate at night in the forest, night animals’ preferred method is by guiding themselves by observing
the position of the stars, just like exploring navigators used to do in the XV century.
Available at: <http://www.kidsdiscover.com/blog/spotlight/animalmigrations- for-kids/> Retrieved on: nov. 20, 2012. Adapted.
Concerning night animals in the forest, according to the text, their main migration method is by
www.tecconcursos.com.br/questoes/297103
Despite discussion to the contrary, the best available economic evidence suggests that immigration expands the economic opportunities and incomes of Americans and helps
reduce the budget deficit.
Recent research suggests that immigration raises wages and lowers prices for consumers throughout the economy. For American business owners, immigrants are both
new sources of customers and employees, helping to expand production using American resources and know-how in sectors ranging from farming to technology. For
American workers, the data suggest that rather than competing for identical jobs, immigrants tend to work alongside and in support of American workers, creating more
and better job opportunities.
Results from recent cutting-edge economics research on the impact of immigration on wages show small but positive effects of immigration on American wages as a
whole. The evidence becomes more mixed, though, when looking at specific groups of workers. While some studies show large negative impacts of immigration on low-
skill workers, other estimates find that immigration raises the wages of all US workers, regardless of education. As further evidence supporting the second set of findings,
one study that examines a period of rapid immigration finds that immigrants do not cause declines in wages, even among less-skilled residents.
Most studies also find that over time immigrants improve the finances of programs like Social Security and can actually help reduce the budget deficit.
These potential additional boosts to economic growth are not necessary to make a case for more immigration. The evidence on the direct effects of immigration — higher
wages, lower prices and net taxes — shows that immigration raises standards of living for Americans.
Text II
The experience of field research in LA while living in the US gave me two insights in support of the thesis defended by the researchers.
First, even poor campesinos from El Salvador can prosper in the US. They send their kids to school, learn English as a second language, start a small business or do work
shunned by Americans.
The question is why a poor El Salvadorean can become a valuable citizen in the US and not in his native country? The US economic and social systems are set up to
provide opportunity for immigrants to prosper. Immigration is the engine of growth and prosperity of the American economy.
The second argument is counter factual. Countries closed to immigration lag behind those opened to foreign skill and knowledge. Take the case of Brazil. In the 19th
century, many predicted Brazil would become a world power along with the US.
The US became a major world superpower and Brazil continues to be an emerging market with a sub par educational system and illiterate population. There are many
reasons and factors that could explain Brazil’s backwardness. One, however, stands out. The country is closed to immigration, even badly needed high skilled foreign
professionals in dynamic sectors of the economy.
The Brazilian economy in 2013 is stagnated with the lowest rate of labor productivity among the BRICS. Lack of qualified foreign workers + poor quality of schools are the
MAIN factor preventing Brazil to become a developed country in this century.
2. April 17, 2013 at 9:42 a.m., Dover - NJ - USA Comment sent by T. McK.
I really wish these writers would look at real jobs and real industries. However the data looks overall, certain jobs that were once routinely done by lower middle class
workers, such as gardening, waiting at table, construction labor and so on, are almost all done by immigrants, especially illegals. And part of the reason is the poor
enforcement of wage laws, and the existence of a cash economy. It may be that these jobs are now forever changed, but since we have such poor opportunities for the
working class, it seems a shame to lose a class of work that had formerly been available.
For decades now, the elites (economists and social thinkers of all sorts) have told us that globalization will bring benefits. And it has, to them. But we have lost much of
what provided a way of life for working folks, each time promising them that it will get better.
3. April 17, 2013 at 9:22 a.m., Dayton - Ohio - USA Comment sent by J. I.
I don’t see how the authors’ data support their case, in large part because they’ve neglected a critical issue-- precisely what kind of immigration are we talking about?
If immigration law requires that immigrants be paid a fair wage, have the right to vote and enjoy legal protections against abusive workplaces, and these are truly
enforced, then yes, it’s reasonable to expect that immigrants would indeed boost living standards for both native-born and immigrant Americans alike.
But if immigrants are instead brought in as low-wage replacements for American workers, not allowed the right to vote or forced to ten or more years to gain it, and
especially if employers have control over their visas and work situations, then living standards are severely damaged for both immigrants and native-born Americans, that is
for everyone but the 0.1% wealthiest Americans who benefit from cheap labor.
www.tecconcursos.com.br/questoes/297104
Despite discussion to the contrary, the best available economic evidence suggests that immigration expands the economic opportunities and incomes of Americans and helps
reduce the budget deficit.
Recent research suggests that immigration raises wages and lowers prices for consumers throughout the economy. For American business owners, immigrants are both
new sources of customers and employees, helping to expand production using American resources and know-how in sectors ranging from farming to technology. For
American workers, the data suggest that rather than competing for identical jobs, immigrants tend to work alongside and in support of American workers, creating more
and better job opportunities.
Results from recent cutting-edge economics research on the impact of immigration on wages show small but positive effects of immigration on American wages as a
whole. The evidence becomes more mixed, though, when looking at specific groups of workers. While some studies show large negative impacts of immigration on low-
skill workers, other estimates find that immigration raises the wages of all US workers, regardless of education. As further evidence supporting the second set of findings,
one study that examines a period of rapid immigration finds that immigrants do not cause declines in wages, even among less-skilled residents.
Most studies also find that over time immigrants improve the finances of programs like Social Security and can actually help reduce the budget deficit.
And these are only the direct measured effects of immigration on individual wages, employment and the budget. Immigrants, particularly higher-skilled immigrants, start
more businesses and participate in scientific and other research at higher rates than native-born Americans. These other findings hint at additional potential benefits of
more immigration, including increases in innovation that could help boost overall economic growth. The high fraction of innovative Silicon Valley start-ups founded by
immigrants are an important example of this point.
These potential additional boosts to economic growth are not necessary to make a case for more immigration. The evidence on the direct effects of immigration — higher
wages, lower prices and net taxes — shows that immigration raises standards of living for Americans.
Text II
The experience of field research in LA while living in the US gave me two insights in support of the thesis defended by the researchers.
First, even poor campesinos from El Salvador can prosper in the US. They send their kids to school, learn English as a second language, start a small business or do work
shunned by Americans.
The question is why a poor El Salvadorean can become a valuable citizen in the US and not in his native country? The US economic and social systems are set up to
provide opportunity for immigrants to prosper. Immigration is the engine of growth and prosperity of the American economy.
The second argument is counter factual. Countries closed to immigration lag behind those opened to foreign skill and knowledge. Take the case of Brazil. In the 19th
century, many predicted Brazil would become a world power along with the US.
The US became a major world superpower and Brazil continues to be an emerging market with a sub par educational system and illiterate population. There are many
reasons and factors that could explain Brazil’s backwardness. One, however, stands out. The country is closed to immigration, even badly needed high skilled foreign
professionals in dynamic sectors of the economy.
The Brazilian economy in 2013 is stagnated with the lowest rate of labor productivity among the BRICS. Lack of qualified foreign workers + poor quality of schools are the
MAIN factor preventing Brazil to become a developed country in this century.
2. April 17, 2013 at 9:42 a.m., Dover - NJ - USA Comment sent by T. McK.
I really wish these writers would look at real jobs and real industries. However the data looks overall, certain jobs that were once routinely done by lower middle class
workers, such as gardening, waiting at table, construction labor and so on, are almost all done by immigrants, especially illegals. And part of the reason is the poor
enforcement of wage laws, and the existence of a cash economy. It may be that these jobs are now forever changed, but since we have such poor opportunities for the
working class, it seems a shame to lose a class of work that had formerly been available.
For decades now, the elites (economists and social thinkers of all sorts) have told us that globalization will bring benefits. And it has, to them. But we have lost much of
what provided a way of life for working folks, each time promising them that it will get better.
3. April 17, 2013 at 9:22 a.m., Dayton - Ohio - USA Comment sent by J. I.
I don’t see how the authors’ data support their case, in large part because they’ve neglected a critical issue-- precisely what kind of immigration are we talking about?
If immigration law requires that immigrants be paid a fair wage, have the right to vote and enjoy legal protections against abusive workplaces, and these are truly
enforced, then yes, it’s reasonable to expect that immigrants would indeed boost living standards for both native-born and immigrant Americans alike.
But if immigrants are instead brought in as low-wage replacements for American workers, not allowed the right to vote or forced to ten or more years to gain it, and
especially if employers have control over their visas and work situations, then living standards are severely damaged for both immigrants and native-born Americans, that is
for everyone but the 0.1% wealthiest Americans who benefit from cheap labor.
When relating the ideas in Text I with those in Text II, one concludes that the
a) author of Comment 1, U.N., has a view that is contrary to that manifested by the author of Text I in terms of a country’s economic standards.
b) author of Comment 2, T. McK, supports the argument on the relation between economic growth and foreign workforce exposed in Text I.
c) author of Comment 1, U.N., and the author of Comment 3, J.I., side with the author of Text I about immigration and economic development.
d) authors of Comments 2 and 3, T. McK and J.I., respectively, oppose the view on the relation between economic development and rates of immigration expressed
in Text I.
e) three commentators agree with the perspective on the importance of immigration defended by the author of Text I.
www.tecconcursos.com.br/questoes/199180
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from use of cell phones, personal reading material
and electric devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped over and sleeping at their stations is truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of falling asleep—however dangerous and wrong—is not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
These adjustments are a step in the right direction, but they don’t go far enough. Managing schedules for shift workers in these high-pressure jobs where public safety is
at stake is too important to settle for improvements that don’t actually solve the problem.
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and
psychological—that can impair the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAAs responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
www.tecconcursos.com.br/questoes/199181
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from use of cell phones, personal reading material
and electric devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers
sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped
over and sleeping at their stations is truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of
falling asleep—however dangerous and wrong—is not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
These adjustments are a step in the right direction, but they don’t go far enough. Managing schedules for shift workers in these high-pressure jobs where public safety is
at stake is too important to settle for improvements that don’t actually solve the problem.
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and
psychological—that can impair the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAAs responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
In Text , the expression downright scary in “Here’s some news of workers sleeping on the job that’s downright scary.” (line 1) can be replaced, without change in
meaning, by
a) faintly alarming
b) really encouraging
c) not at all terrifying
d) a little intimidating
e) absolutely frightening
www.tecconcursos.com.br/questoes/199182
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from use of cell phones, personal reading material
and electric devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers
sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped
over and sleeping at their stations is truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of
falling asleep—however dangerous and wrong—is not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
These adjustments are a step in the right direction, but they don’t go far enough. Managing schedules for shift workers in these high-pressure jobs where public safety is
at stake is too important to settle for improvements that don’t actually solve the problem.
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety-related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and
psychological—that can impair the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAA’s responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
In the fragment of Text: “It is important, I believe, to separate the issue of air traffic controllers sleeping on the job from their choice to play with laptops and cell phones
when they are supposed to be working” (lines 6-7), Dr. Michael Breus implies that
a) falling asleep during the work shift is a far more serious violation of FAA policies because this is a behavior controllers cannot always be blamed for.
b) using laptops and cell phones in night shifts is a totally inoffensive behavior of air traffic controllers.
c) using electronic distractors at work during work shifts should not be punished when controllers are fighting off sleep.
d) playing with technological gadgets continuously at their working stations is justified if air traffic controllers are trying to avoid sleep.
e) applying penalties to air traffic controllers who sleep and use electronic devices while on duty is highly recommended.
www.tecconcursos.com.br/questoes/199183
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from the use of cell phones, personal reading material
and electronic devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers
sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped
over and sleeping at their stations are truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of
falling asleep—however dangerous and wrong—are not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
These adjustments are a step in the right direction, but they don’t go far enough. Managing schedules for shift workers in these high-pressure jobs where public safety is
at stake is too important to settle for improvements that don’t actually solve the problem.
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety-related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and psychological—that can impair
the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAA’s responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
According to Text, the Federal Aviation Administration, in response to serious public safety issues, has defined new norms that include all of the following, EXCEPT
a) include longer resting periods between shifts, which amount to at least 9 hours.
b) reinforce supervision of air control towers staff on duty in late night and early morning shifts.
c) forbid air traffic controllers to change shifts with a colleague if the minimum resting period is not respected.
d) restrict air traffic controllers from working an overnight shift after having spent a day away from their post.
e) prohibit air control towers staff from working shifts with a 9-hour resting period in between.
www.tecconcursos.com.br/questoes/199184
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from the use of cell phones, personal reading material
and electronic devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers
sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped
over and sleeping at their stations are truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of
falling asleep—however dangerous and wrong—are not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety-related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and psychological—that can impair
the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAA’s responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
According to the author of Text, it has been proven scientifically that workers who have jobs that demand they work in shifts
a) manage their time inefficiently if they are not allowed to take a nap.
b) get fewer hours of sleep than workers who have routine schedules.
c) are in greater danger of committing serious mistakes if assigned to less demanding jobs.
d) are more liable to accidents and injuries in their leisure hours.
e) are unlikely to develop severe health disorders.
www.tecconcursos.com.br/questoes/199185
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from the use of cell phones, personal reading material
and electronic devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers
sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped
over and sleeping at their stations are truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of
falling asleep—however dangerous and wrong—are not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
These adjustments are a step in the right direction, but they don’t go far enough. Managing schedules for shift workers in these high-pressure jobs where public safety is
at stake is too important to settle for improvements that don’t actually solve the problem.
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety-related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and psychological—that can impair
the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAA’s responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
a) fatigue
b) faulty memory
c) mistaken decisions
d) psychiatric disorders
e) retarded reactions
www.tecconcursos.com.br/questoes/199187
President Obama lectured air traffic controllers in an exclusive interview with ABC News, impressing on them the enormous responsibility of safeguarding flying passengers
and telling them, “You better do your job.”
The president spoke after several controllers were caught asleep on the job and the man in charge of air traffic control, Hank Krakowski, resigned on Thursday.
“The individuals who are falling asleep on the job, that’s unacceptable,” the president told ABC News’ George Stephanopoulos in an exclusive interview on Thursday. “The
fact is, when you’re responsible for the lives and safety of people up in the air, you better do your job. So, there’s an element of individual responsibility that has to be
dealt with.”
Five controllers have been suspended for apparently napping on the job while planes were trying to land at their airports.
The president said a full review of air traffic control work shifts is under way.
“What we also have to look at is air traffic control systems. Do we have enough back up? Do we have enough people? Are they getting enough rest time?” Obama said.
In March, two commercial airliners were forced to land unassisted at Washington, D.C.’s Reagan National Airport after a controller apparently fell asleep.
Just days later, two controllers at the Preston Smith International Airport in Lubbock, Texas, did not hand off control of a departing aircraft to another control center and it
took repeated attempts for them to be reached.
On Feb. 19, an air traffic controller in Knoxville, Tenn., slept during an overnight shift. Sources told ABC News that the worker even took pillows and cushions from a break
room to build a makeshift bed on the control room floor.
And this month, there were two more incidents. A controller fell asleep on the job in Seattle, and days later a controller in Reno was snoozing when a plane carrying a
critically ill passenger was seeking permission to land.
The FAA and the controllers’ union have been studying the fatigue issue for over a year and their report finds that “acute fatigue occurs on a daily basis,” and “fatigue can
occur at any time, on any shift.”
Some sleep experts said controllers are ripe for fatigue because they often bounce between day shifts and night shifts. “When we’re constantly having to adjust to different
work schedules, our body is always playing catch up,” said Philip Gehrman, Director of the Behavioral Sleep Program at the University of Pennsylvania.
Controllers on the night shift have another hurdle: they often work in dim light conditions with little stimulation between radio calls. “That’s exactly the kind of type of task
that’s hardest to maintain, when you’re at the wrong point in your biological rhythms,” said Gehrman.
One recommendation from the government study suggests allowing controllers to take scheduled naps, with breaks as long as two and a half hours to allow for sleeping
and waking up.
Sleep experts said a long break in the middle of an eight hour overnight shift would help, but it might be a tough sell politically. It has taken decades to try to come up with
new fatigue rules for pilots and it may not be any easier when it comes to controllers.
President Obama’s warning to air traffic controllers “‘You better do your job.’” (line 2, Text) can be rephrased as
www.tecconcursos.com.br/questoes/199190
President Obama lectured air traffic controllers in an exclusive interview with ABC News, impressing on them the enormous responsibility of safeguarding flying passengers
and telling them, “You better do your job.”
The president spoke after several controllers were caught asleep on the job and the man in charge of air traffic control, Hank Krakowski, resigned on Thursday.
“The individuals who are falling asleep on the job, that’s unacceptable,” the president told ABC News’ George Stephanopoulos in an exclusive interview on Thursday. “The
fact is, when you’re responsible for the lives and safety of people up in the air, you better do your job. So, there’s an element of individual responsibility that has to be
dealt with.”
Five controllers have been suspended for apparently napping on the job while planes were trying to land at their airports.
The president said a full review of air traffic control work shifts is under way.
“What we also have to look at is air traffic control systems. Do we have enough back up? Do we have enough people? Are they getting enough rest time?” Obama said.
In March, two commercial airliners were forced to land unassisted at Washington, D.C.’s Reagan National Airport after a controller apparently fell asleep.
Just days later, two controllers at the Preston Smith International Airport in Lubbock, Texas, did not hand off control of a departing aircraft to another control center and it
took repeated attempts for them to be reached.
On Feb. 19, an air traffic controller in Knoxville, Tenn., slept during an overnight shift. Sources told ABC News that the worker even took pillows and cushions from a break
room to build a makeshift bed on the control room floor.
And this month, there were two more incidents. A controller fell asleep on the job in Seattle, and days later a controller in Reno was snoozing when a plane carrying a
critically ill passenger was seeking permission to land.
The FAA and the controllers’ union have been studying the fatigue issue for over a year and their report finds that “acute fatigue occurs on a daily basis,” and “fatigue can
occur at any time, on any shift.”
Some sleep experts said controllers are ripe for fatigue because they often bounce between day shifts and night shifts. “When we’re constantly having to adjust to different
work schedules, our body is always playing catch up,” said Philip Gehrman, Director of the Behavioral Sleep Program at the University of Pennsylvania.
Controllers on the night shift have another hurdle: they often work in dim light conditions with little stimulation between radio calls. “That’s exactly the kind of type of task
that’s hardest to maintain, when you’re at the wrong point in your biological rhythms,” said Gehrman.
One recommendation from the government study suggests allowing controllers to take scheduled naps, with breaks as long as two and a half hours to allow for sleeping
and waking up.
Sleep experts said a long break in the middle of an eight hour overnight shift would help, but it might be a tough sell politically. It has taken decades to try to come up with
new fatigue rules for pilots and it may not be any easier when it comes to controllers.
a) air traffic controllers are frequently changing shifts and such irregular routine disrupts their biological rhythm.
b) air traffic controllers are generally fatigued because they arrive home late and want to catch up with family news.
c) regular sleep periods at the same time on all days of the week are mandatory.
d) adjusting to varied working hours is like playing a game to catch up on leisure time.
e) dark rooms and monotonous working routines can significantly alter our internal clocks.
www.tecconcursos.com.br/questoes/199191
President Obama lectured air traffic controllers in an exclusive interview with ABC News, impressing on them the enormous responsibility of safeguarding flying passengers
and telling them, “You better do your job.”
The president spoke after several controllers were caught asleep on the job and the man in charge of air traffic control, Hank Krakowski, resigned on Thursday.
“The individuals who are falling asleep on the job, that’s unacceptable,” the president told ABC News’ George Stephanopoulos in an exclusive interview on Thursday. “The
fact is, when you’re responsible for the lives and safety of people up in the air, you better do your job. So, there’s an element of individual responsibility that has to be
dealt with.”
Five controllers have been suspended for apparently napping on the job while planes were trying to land at their airports.
The president said a full review of air traffic control work shifts is under way.
“What we also have to look at is air traffic control systems. Do we have enough back up? Do we have enough people? Are they getting enough rest time?” Obama said.
In March, two commercial airliners were forced to land unassisted at Washington, D.C.’s Reagan National Airport after a controller apparently fell asleep.
Just days later, two controllers at the Preston Smith International Airport in Lubbock, Texas, did not hand off control of a departing aircraft to another control center and it
took repeated attempts for them to be reached.
On Feb. 19, an air traffic controller in Knoxville, Tenn., slept during an overnight shift. Sources told ABC News that the worker even took pillows and cushions from a break
room to build a makeshift bed on the control room floor.
And this month, there were two more incidents. A controller fell asleep on the job in Seattle, and days later a controller in Reno was snoozing when a plane carrying a
critically ill passenger was seeking permission to land.
The FAA and the controllers’ union have been studying the fatigue issue for over a year and their report finds that “acute fatigue occurs on a daily basis,” and “fatigue can
occur at any time, on any shift.”
Some sleep experts said controllers are ripe for fatigue because they often bounce between day shifts and night shifts. “When we’re constantly having to adjust to different
work schedules, our body is always playing catch up,” said Philip Gehrman, Director of the Behavioral Sleep Program at the University of Pennsylvania.
Controllers on the night shift have another hurdle: they often work in dim light conditions with little stimulation between radio calls. “That’s exactly the kind of type of task
that’s hardest to maintain, when you’re at the wrong point in your biological rhythms,” said Gehrman.
One recommendation from the government study suggests allowing controllers to take scheduled naps, with breaks as long as two and a half hours to allow for sleeping
and waking up.
Sleep experts said a long break in the middle of an eight hour overnight shift would help, but it might be a tough sell politically. It has taken decades to try to come up with
new fatigue rules for pilots and it may not be any easier when it comes to controllers.
The fragment of Text “but it might be a tough sell politically.” (line 30) implies that it would be
a) easy to sell the idea that air traffic controllers need political representatives.
b) hard to convince air traffic management that controllers need long breaks during their working shifts.
c) fair to blame the working conditions of air traffic controllers on politicians who defend new job legislation.
d) possible to persuade politicians to take longer intervals between working shifts.
e) difficult to argue that sleep experts understand the reasons for sleep disorders of air traffic controllers.
www.tecconcursos.com.br/questoes/199196
Here’s some news of workers sleeping on the job that’s downright scary. A news investigation produced a story and footage of air traffic controllers at Westchester County
Airport sleeping during their shifts. The video, provided to the news outlet by an employee in the air traffic control tower at Westchester Airport, also shows controllers
reading and using laptops and cell phones while on duty. The Federal Aviation Administration (FAA) bans its controllers from the use of cell phones, personal reading material
and electronic devices while on duty. Sleeping is prohibited anywhere in air traffic control towers.
All of these violations are alarming and dangerous, and pose a serious public safety problem. It is important, I believe, to separate the issue of air traffic controllers
sleeping on the job from their choice to play with laptops and cell phones when they are supposed to be working. The video images showing air traffic controllers slumped
over and sleeping at their stations are truly frightening. But the issue of sleep deprivation among air traffic controllers is a very real one, and means that some instances of
falling asleep—however dangerous and wrong—are not entirely the controllers’ fault, or even within their control.
Unfortunately this is not a new problem. We’ve seen several instances of air traffic controllers falling asleep on duty in recent months.
In response to these cases, the FAA in 2011 revised its regulations for air traffic controllers to include additional time for rest between shifts. The FAA:
Raised the minimum amount of time off between work shifts to 9 hours from 8 hours
Prohibited air traffic controllers from swapping shifts without having a minimum of 9 hours off in between shifts
Increased supervisor coverage in air traffic control towers during late night and early morning shifts
Prohibited air traffic controllers from picking up an overnight shift after a day off
These adjustments are a step in the right direction, but they don’t go far enough. Managing schedules for shift workers in these high-pressure jobs where public safety is
at stake is too important to settle for improvements that don’t actually solve the problem.
Shift workers of all types face challenges to getting enough sleep while managing long hours, overnight shifts, and changing schedules that fluctuate between day and
night. Research shows that:
People who engage in shift work get less sleep overall than those of us who work more regular hours
Shift workers are at higher risk for illness and chronic disease
The sleep deprivation associated with shift work increases the risk of accidents, injuries and mistakes in high-profile, public-safety-related industries like medicine
and law enforcement, as well as air traffic control
In addition to making people more prone to accidents and injury, sleep deprivation causes a number of negative effects—both physical and psychological—that can impair
the on-the-job performance of air traffic controllers and other shift workers. Sleep deprivation:
I think we can all agree that we don’t want the people responsible for guiding our planes to be sluggish, slow-reacting, forgetful, fatigued and of questionable judgment.
But that’s exactly what being sleep deprived can make them!
It’s the FAA’s responsibility to create workplace regulations that enable air traffic controllers to get the rest they need. This can include not just mandating reasonable time
off between shifts, but also giving controllers breaks during shifts and allowing them to nap on their breaks. There are also some basic things that the controllers
themselves—or any shift workers—can do to help avoid sleep deprivation:
Make sure to get adequate rest before a shift begins. Take a nap before work, if need be.
Limit your reliance on caffeine. While it’s okay as an occasional pick-me-up, coffee and caffeinated beverages are not substitutes for adequate sleep. And caffeine
can interfere with your sleep when you actually want and need to be sleepy.
Keep a strong and consistent sleep routine both during your workdays and your days off. It’s not always easy, but shift workers in particular need to build their off-
duty schedules around making sure they get the sleep they need.
Similarly to the recent changes in health care, the FAA is moving in the right direction to help its employees get the sleep they need to do their jobs safely. As this latest
incident at Westchester Airport confirms, there is a great deal of work still to be done. And it’s in everyone’s best—and safest—interest that progress continues to be
made.
Sweet Dreams,
Text II
President Obama lectured air traffic controllers in an exclusive interview with ABC News, impressing on them the enormous responsibility of safeguarding flying passengers
and telling them, “You better do your job.”
The president spoke after several controllers were caught asleep on the job and the man in charge of air traffic control, Hank Krakowski, resigned on Thursday.
“The individuals who are falling asleep on the job, that’s unacceptable,” the president told ABC News’ George Stephanopoulos in an exclusive interview on Thursday. “The
fact is, when you’re responsible for the lives and safety of people up in the air, you better do your job. So, there’s an element of individual responsibility that has to be
dealt with.”
Five controllers have been suspended for apparently napping on the job while planes were trying to land at their airports.
The president said a full review of air traffic control work shifts is under way.
“What we also have to look at is air traffic control systems. Do we have enough back up? Do we have enough people? Are they getting enough rest time?” Obama said.
In March, two commercial airliners were forced to land unassisted at Washington, D.C.’s Reagan National Airport after a controller apparently fell asleep.
Just days later, two controllers at the Preston Smith International Airport in Lubbock, Texas, did not hand off control of a departing aircraft to another control center and it
took repeated attempts for them to be reached.
On Feb. 19, an air traffic controller in Knoxville, Tenn., slept during an overnight shift. Sources told ABC News that the worker even took pillows and cushions from a break
room to build a makeshift bed on the control room floor.
And this month, there were two more incidents. A controller fell asleep on the job in Seattle, and days later a controller in Reno was snoozing when a plane carrying a
critically ill passenger was seeking permission to land.
The FAA and the controllers’ union have been studying the fatigue issue for over a year and their report finds that “acute fatigue occurs on a daily basis,” and “fatigue can
occur at any time, on any shift.”
Some sleep experts said controllers are ripe for fatigue because they often bounce between day shifts and night shifts. “When we’re constantly having to adjust to different
work schedules, our body is always playing catch up,” said Philip Gehrman, Director of the Behavioral Sleep Program at the University of Pennsylvania.
Controllers on the night shift have another hurdle: they often work in dim light conditions with little stimulation between radio calls. “That’s exactly the kind of type of task
that’s hardest to maintain, when you’re at the wrong point in your biological rhythms,” said Gehrman.
One recommendation from the government study suggests allowing controllers to take scheduled naps, with breaks as long as two and a half hours to allow for sleeping
and waking up.
Sleep experts said a long break in the middle of an eight hour overnight shift would help, but it might be a tough sell politically. It has taken decades to try to come up with
new fatigue rules for pilots and it may not be any easier when it comes to controllers.
a) only Text I discusses how the absence of sleep can disturb the routine of air traffic controllers.
b) only Text II introduces a list of recommendations to improve the on-the-job performance of air traffic controllers.
c) neither Text I nor Text II seem to be concerned with improving air traffic controllers’ health conditions.
d) both Text I and Text II mention alarming situations resulting from air traffic controllers’ sleep deprivation.
e) Text I tries to justify why air traffic controllers constantly fall asleep while on duty, while Text II only condemns their improper behavior at work.
www.tecconcursos.com.br/questoes/220681
Although far fewer women work in the oil and gas (O&G) industry compared to men, many women find rewarding careers in the industry. Five women were asked the
same questions regarding their career choices in the oil and gas industry.
Question 1: Why did you choose the oil and gas industry?
Woman 3: They offered me a job! I couldn’t turn down the great starting salary and a chance to live in New Orleans.
Woman 4: I did not really choose the oil and gas industry as much as it chose me.
Woman 5: I chose the oil and gas industry because of the challenging projects, and I want to be part of our country’s energy solution.
Question 2: How did you get your start in the oil and gas industry?
Woman 1: I went to a university where all major oil companies recruit. I received a summer internship with Texaco before my last year of my Master’s degree.
Woman 3: At the time, campus recruiters came to the geosciences department of my university annually and they sponsored scholarships for graduate students to help
complete their research. Even though my Master’s thesis was more geared toward environmental studies, as a recipient of one of these scholarships, my graduate advisor
strongly encouraged me to participate when the time came for O&G Industry interviews.
Woman 4: I was working for a company in another state where oil and gas was not its primary business. When the company sold its division in the state where I was
working, they offered me a position at the company’s headquarters in Houston managing the aftermarket sales for the company’s largest region. Aftermarket sales
supported the on-highway, construction, industrial, agricultural and the oil and gas markets. After one year, the company asked me to take the position of managing their
marine and offshore power products division. I held that position for three years. I left that company to join a new startup company where I hold the position of president.
Woman 5: My first job in the oil and gas industry was an internship with Mobil Oil Corp., in New Orleans. I worked with a lot of smart, focused and talented geoscientists
and engineers.
Woman 1: Tough one to describe a typical day. I generally read email, go to a couple of meetings and work with the field’s earth model or look at seismic.
Woman 2: I talk with clients, help prepare bids and work on getting projects out the door. My days are never the same, which is what I love about the job I have.
Woman 3: I usually work from 7:30 a.m. – 6:30 p.m. (although the official day is shorter). We call the field every morning for an update on operations, security,
construction, facilities and production engineering activities. I work with my team leads on short-term and long-term projects to enhance production (a lot of emails and
Powerpoint). I usually have 2-3 meetings per day to discuss/prioritize/review ongoing or upcoming work (production optimization, simulation modeling, drilling plans,
geologic interpretation, workovers, etc.). Beyond our team, I also participate in a number of broader business initiatives and leadership teams.
Woman 4: A typical day is a hectic day for me. My day usually starts well before 8 a.m. with phone calls and emails with our facility in Norway, as well as other business
relationships abroad. At the office, I am involved in the daily business operations and also stay closely involved in the projects and the sales efforts. On any given day I am
working on budgets and finance, attending project meetings, attending engineering meetings, reviewing drawings and technical specifications, meeting with clients and
prospective clients, reviewing sales proposals, evaluating new business opportunities and making a lot of decisions.
Woman 5: On most days I work on my computer to complete my projects. I interpret logs, create maps, research local and regional geology or write documents. I go to
project meetings almost every day. I typically work only during business hours, but there are times when I get calls at night or on weekends from a rig or other geologists
for assistance with a technical problem.
According to Text, when asked about their choice of the oil and gas industry,
a) all the interviewees pointed out the relevance of having a green job.
b) all the women felt really committed to solving the nation’s energy problems.
c) all the interviewees mentioned that the challenges of the field attracted them.
d) just one of the women commented that she was attracted by the location of the job.
e) no interviewee considered the salary an important factor for accepting the job.
www.tecconcursos.com.br/questoes/220682
Although far fewer women work in the oil and gas (O&G) industry compared to men, many women find rewarding careers in the industry. Five women were asked the
same questions regarding their career choices in the oil and gas industry.
Question 1: Why did you choose the oil and gas industry?
Woman 3: They offered me a job! I couldn’t turn down the great starting salary and a chance to live in New Orleans.
Woman 4: I did not really choose the oil and gas industry as much as it chose me.
Woman 5: I chose the oil and gas industry because of the challenging projects, and I want to be part of our country’s energy solution.
Question 2: How did you get your start in the oil and gas industry?
Woman 1: I went to a university where all major oil companies recruit. I received a summer internship with Texaco before my last year of my Master’s degree.
Woman 3: At the time, campus recruiters came to the geosciences department of my university annually and they sponsored scholarships for graduate students to help
complete their research. Even though my Master’s thesis was more geared toward environmental studies, as a recipient of one of these scholarships, my graduate advisor
strongly encouraged me to participate when the time came for O&G Industry interviews.
Woman 4: I was working for a company in another state where oil and gas was not its primary business. When the company sold its division in the state where I was
working, they offered me a position at the company’s headquarters in Houston managing the aftermarket sales for the company’s largest region. Aftermarket sales
supported the on-highway, construction, industrial, agricultural and the oil and gas markets. After one year, the company asked me to take the position of managing their
marine and offshore power products division. I held that position for three years. I left that company to join a new startup company where I hold the position of president.
Woman 5: My first job in the oil and gas industry was an internship with Mobil Oil Corp., in New Orleans. I worked with a lot of smart, focused and talented geoscientists
and engineers.
Woman 1: Tough one to describe a typical day. I generally read email, go to a couple of meetings and work with the field’s earth model or look at seismic.
Woman 2: I talk with clients, help prepare bids and work on getting projects out the door. My days are never the same, which is what I love about the job I have.
Woman 3: I usually work from 7:30 a.m. – 6:30 p.m. (although the official day is shorter). We call the field every morning for an update on operations, security,
construction, facilities and production engineering activities. I work with my team leads on short-term and long-term projects to enhance production (a lot of emails and
Powerpoint). I usually have 2-3 meetings per day to discuss/prioritize/review ongoing or upcoming work (production optimization, simulation modeling, drilling plans,
geologic interpretation, workovers, etc.). Beyond our team, I also participate in a number of broader business initiatives and leadership teams.
Woman 4: A typical day is a hectic day for me. My day usually starts well before 8 a.m. with phone calls and emails with our facility in Norway, as well as other business
relationships abroad. At the office, I am involved in the daily business operations and also stay closely involved in the projects and the sales efforts. On any given day I am
working on budgets and finance, attending project meetings, attending engineering meetings, reviewing drawings and technical specifications, meeting with clients and
prospective clients, reviewing sales proposals, evaluating new business opportunities and making a lot of decisions.
Woman 5: On most days I work on my computer to complete my projects. I interpret logs, create maps, research local and regional geology or write documents. I go to
project meetings almost every day. I typically work only during business hours, but there are times when I get calls at night or on weekends from a rig or other geologists
for assistance with a technical problem.
In Text, based on the interviewees’ experience, it can be said that getting a job in the O&G industry can result from all of the following situations, EXCEPT
a) participating in a job fair.
b) taking part in O&G Industry interviews.
c) applying to specific job ads via internet sites.
d) attending a university where major oil companies look for prospective employees.
e) getting previous experience in an internship program with an O&G organization.
www.tecconcursos.com.br/questoes/220684
Although far fewer women work in the oil and gas (O&G) industry compared to men, many women find rewarding careers in the industry. Five women were asked the
same questions regarding their career choices in the oil and gas industry.
Question 1: Why did you choose the oil and gas industry?
Woman 3: They offered me a job! I couldn’t turn down the great starting salary and a chance to live in New Orleans.
Woman 4: I did not really choose the oil and gas industry as much as it chose me.
Woman 5: I chose the oil and gas industry because of the challenging projects, and I want to be part of our country’s energy solution.
Question 2: How did you get your start in the oil and gas industry?
Woman 1: I went to a university where all major oil companies recruit. I received a summer internship with Texaco before my last year of my Master’s degree.
Woman 3: At the time, campus recruiters came to the geosciences department of my university annually and they sponsored scholarships for graduate students to help
complete their research. Even though my Master’s thesis was more geared toward environmental studies, as a recipient of one of these scholarships, my graduate advisor
strongly encouraged me to participate when the time came for O&G Industry interviews.
Woman 4: I was working for a company in another state where oil and gas was not its primary business. When the company sold its division in the state where I was
working, they offered me a position at the company’s headquarters in Houston managing the aftermarket sales for the company’s largest region. Aftermarket sales
supported the on-highway, construction, industrial, agricultural and the oil and gas markets. After one year, the company asked me to take the position of managing their
marine and offshore power products division. I held that position for three years. I left that company to join a new startup company where I hold the position of president.
Woman 5: My first job in the oil and gas industry was an internship with Mobil Oil Corp., in New Orleans. I worked with a lot of smart, focused and talented geoscientists
and engineers.
Woman 1: Tough one to describe a typical day. I generally read email, go to a couple of meetings and work with the field’s earth model or look at seismic.
Woman 2: I talk with clients, help prepare bids and work on getting projects out the door. My days are never the same, which is what I love about the job I have.
Woman 3: I usually work from 7:30 a.m. – 6:30 p.m. (although the official day is shorter). We call the field every morning for an update on operations, security,
construction, facilities and production engineering activities. I work with my team leads on short-term and long-term projects to enhance production (a lot of emails and
Powerpoint). I usually have 2-3 meetings per day to discuss/prioritize/review ongoing or upcoming work (production optimization, simulation modeling, drilling plans,
geologic interpretation, workovers, etc.). Beyond our team, I also participate in a number of broader business initiatives and leadership teams.
Woman 4: A typical day is a hectic day for me. My day usually starts well before 8 a.m. with phone calls and emails with our facility in Norway, as well as other business
relationships abroad. At the office, I am involved in the daily business operations and also stay closely involved in the projects and the sales efforts. On any given day I am
working on budgets and finance, attending project meetings, attending engineering meetings, reviewing drawings and technical specifications, meeting with clients and
prospective clients, reviewing sales proposals, evaluating new business opportunities and making a lot of decisions.
Woman 5: On most days I work on my computer to complete my projects. I interpret logs, create maps, research local and regional geology or write documents. I go to
project meetings almost every day. I typically work only during business hours, but there are times when I get calls at night or on weekends from a rig or other geologists
for assistance with a technical problem.
www.tecconcursos.com.br/questoes/220764
By Katie Weir
From Talent Acquisition Specialist, Campus
Talisman Energy
Fix up your resumé – take it to your career centre at your university and they’ll help you.
Write a compelling cover letter that speaks to your best qualities – save the pretentious language for your English papers.
Join a professional association and attend their events – if you feel uncomfortable attending alone, try volunteering at them. Having a job to do gives you an
excuse to interact with the attendees, and an easy way to start up a conversation the next time you see them.
Do your research – I can’t stress this enough. I want students to apply to Talisman, not because we have open jobs, but because they actually have an interest in what
we’re doing, and want to be a part of it.
Be confident, but stay humble – it’s important to communicate your abilities effectively, but it’s also important to be conscious of the phrase: “sense of entitlement.”
This generation entering the workforce has already been branded with the word “entitlement,” so students will need to fight against this bias from the very beginning of
any relationship with people in the industry – be aware that you will need to roll up your sleeves and work hard for the first couple years, and you will be rewarded in the
end.
Retrieved and adapted from URL: <http://talentegg.ca/incubator/2010/11/29/how-to-start-a-career-in-the-oil-and-gas-industry-what-employers-say/>. Access on: February 14, 2012.
www.tecconcursos.com.br/questoes/220765
By Katie Weir
From Talent Acquisition Specialist, Campus
Talisman Energy
Fix up your resumé – take it to your career centre at your university and they’ll help you.
Write a compelling cover letter that speaks to your best qualities – save the pretentious language for your English papers.
Join a professional association and attend their events – if you feel uncomfortable attending alone, try volunteering at them. Having a job to do gives you an
excuse to interact with the attendees, and an easy way to start up a conversation the next time you see them.
Do your research – I can’t stress this enough. I want students to apply to Talisman, not because we have open jobs, but because they actually have an interest in what
we’re doing, and want to be a part of it.
Be confident, but stay humble – it’s important to communicate your abilities effectively, but it’s also important to be conscious of the phrase: “sense of entitlement.”
This generation entering the workforce has already been branded with the word “entitlement,” so students will need to fight against this bias from the very beginning of
any relationship with people in the industry – be aware that you will need to roll up your sleeves and work hard for the first couple years, and you will be rewarded in the
end.
Retrieved and adapted from URL: <http://talentegg.ca/incubator/2010/11/29/how-to-start-a-career-in-the-oil-and-gas-industry-what-employers-say/>. Access on: February 14, 2012.
The fragment that closes Text, “be aware that you will need to roll up your sleeves and work hard for the first couple years, and you will be rewarded in the end.”, implies
that one must
a) make an effort to commit totally to one’s job in the initial phase, in order to reach success in the future.
b) wear formal clothes to work so that, as years go by, a couple of top-rank officers can recognize one’s worth.
c) accept jobs with severe routines only in order to obtain early promotions.
d) avoid postponing assigned tasks and wearing inappropriate clothes in the working environment.
e) show commitment to the working routine and demand the rewards frequently offered to senior employees.
Although far fewer women work in the oil and gas (O&G) industry compared to men, many women find rewarding careers in the industry. Five women were asked the
same questions regarding their career choices in the oil and gas industry.
Question 1: Why did you choose the oil and gas industry?
Woman 3: They offered me a job! I couldn’t turn down the great starting salary and a chance to live in New Orleans.
Woman 4: I did not really choose the oil and gas industry as much as it chose me.
Woman 5: I chose the oil and gas industry because of the challenging projects, and I want to be part of our country’s energy solution.
Question 2: How did you get your start in the oil and gas industry?
Woman 1: I went to a university where all major oil companies recruit. I received a summer internship with Texaco before my last year of my Master’s degree.
Woman 3: At the time, campus recruiters came to the geosciences department of my university annually and they sponsored scholarships for graduate students to help
complete their research. Even though my Master’s thesis was more geared toward environmental studies, as a recipient of one of these scholarships, my graduate advisor
strongly encouraged me to participate when the time came for O&G Industry interviews.
Woman 4: I was working for a company in another state where oil and gas was not its primary business. When the company sold its division in the state where I was
working, they offered me a position at the company’s headquarters in Houston managing the aftermarket sales for the company’s largest region. Aftermarket sales
supported the on-highway, construction, industrial, agricultural and the oil and gas markets. After one year, the company asked me to take the position of managing their
marine and offshore power products division. I held that position for three years. I left that company to join a new startup company where I hold the position of president.
Woman 5: My first job in the oil and gas industry was an internship with Mobil Oil Corp., in New Orleans. I worked with a lot of smart, focused and talented geoscientists
and engineers.
Woman 1: Tough one to describe a typical day. I generally read email, go to a couple of meetings and work with the field’s earth model or look at seismic.
Woman 2: I talk with clients, help prepare bids and work on getting projects out the door. My days are never the same, which is what I love about the job I have.
Woman 3: I usually work from 7:30 a.m. – 6:30 p.m. (although the official day is shorter). We call the field every morning for an update on operations, security,
construction, facilities and production engineering activities. I work with my team leads on short-term and long-term projects to enhance production (a lot of emails and
Powerpoint). I usually have 2-3 meetings per day to discuss/prioritize/review ongoing or upcoming work (production optimization, simulation modeling, drilling plans,
geologic interpretation, workovers, etc.). Beyond our team, I also participate in a number of broader business initiatives and leadership teams.
Woman 4: A typical day is a hectic day for me. My day usually starts well before 8 a.m. with phone calls and emails with our facility in Norway, as well as other business
relationships abroad. At the office, I am involved in the daily business operations and also stay closely involved in the projects and the sales efforts. On any given day I am
working on budgets and finance, attending project meetings, attending engineering meetings, reviewing drawings and technical specifications, meeting with clients and
prospective clients, reviewing sales proposals, evaluating new business opportunities and making a lot of decisions.
Woman 5: On most days I work on my computer to complete my projects. I interpret logs, create maps, research local and regional geology or write documents. I go to
project meetings almost every day. I typically work only during business hours, but there are times when I get calls at night or on weekends from a rig or other geologists
for assistance with a technical problem.
Text II
By Katie Weir
From Talent Acquisition Specialist, Campus
Talisman Energy
Fix up your resumé – take it to your career centre at your university and they’ll help you.
Write a compelling cover letter that speaks to your best qualities – save the pretentious language for your English papers.
Join a professional association and attend their events – if you feel uncomfortable attending alone, try volunteering at them. Having a job to do gives you an excuse to interact with the attendees, and an easy way to start up a conversation the next time you see them.
Do your research – I can’t stress this enough. I want students to apply to Talisman, not because we have open jobs, but because they actually have an interest in what
we’re doing, and want to be a part of it.
Be confident, but stay humble – it’s important to communicate your abilities effectively, but it’s also important to be conscious of the phrase: “sense of entitlement.”
Retrieved and adapted from URL: <http://talentegg.ca/incubator/2010/11/29/how-to-start-a-career-in-the-oil-and-gas-industry -what-employers-say/>. Accessed on: February 14, 2012.
www.tecconcursos.com.br/questoes/291620
By Jessica Tippee
Assistant Editor
Not Mexico, not Brazil. The next offshore frontier is the Arctic, according to Andrew Reid, CEO of energy analysts Douglas-Westwood Company. “More than 400 fields have
been discovered to date in the Arctic, providing reserves in excess of 240 Bboe (billions of barrels of oil equivalent)” Reid said. He was a guest speaker at a recent
conference of the International Association of Drilling Contractors (IADC), an agency that has exclusively represented the worldwide oil and gas drilling industry since 1940.
Reid also affirmed that “There is no doubt that further drilling activity in this region could have a major impact on offshore production in the foreseeable future.”
Meanwhile, Infield Systems Ltd. has identified more than 130 Bboe in discovered oil, gas, and condensate reserves throughout the offshore arctic and sub-arctic regions.
Around 114 Bboe are gas reserves, and 16 Bbbl (billions of barrels) are oil. Infield’s additional report on offshore arctic oil and gas prospects through 2017 includes current
and future offshore oil and gas developments within the Arctic Circle, and in the “sub-arctic” regions of Sakhalin Island, the Jeanne d’Arc basin offshore eastern Canada,
and the Cook Inlet off Alaska.
Arctic capital expenditure should increase more than $7 billion annually through 2017. Russia, with its reserves, should largely drive this expenditure, especially during
2013-2015, assuming the Shtokman project goes ahead. This project includes a comprehensive development of satellites in the Barents Sea, and joint development of the
Prirazlomnoye and Dolginskoye oil fields in the Pechora Sea.
Prirazlomnaya is the first offshore ice-resistant stationary platform designed and built in Russia measuring 126 m (413 ft) wide by 126 m long. With a weight of 117,000
tons, the platform can accommodate a crew of up to 200, and provide year-round operation.
The platform is designed to withstand temperatures that can drop to −50 °C (−58 °F) during winter, and ice formation – the location is typically free from ice for 110 days
each year. The platform will provide drilling, production, and oil storage services, along with preparation and shipment of final products from the Prirazlomnoye field.
Gazprom expects to drill up to 40 directional wells. Dutch contractor Tideway has been dumping 100,000 metric tons of stone (110,231 tons) as an erosion protection
system around the platform to secure it to the seabed. The development is targeting annual production of more than 6 million tons (43.8 MMbbl). Associated produced gas
will be used for the platform’s needs. Production operations are scheduled to start this year.
Offshore Magazine. May 2, 2012 . Volume 72, Issue 5 Available at: <http://www.offshore-mag.com/articles/print/volume-72/issue-5/international-report/arctic-e-p-activity-heats-up.html>. Retrieved on:
9 May 2012. Adapted.
www.tecconcursos.com.br/questoes/291621
By Jessica Tippee
Assistant Editor
Not Mexico, not Brazil. The next offshore frontier is the Arctic, according to Andrew Reid, CEO of energy analysts Douglas-Westwood Company. “More than 400 fields have
been discovered to date in the Arctic, providing reserves in excess of 240 Bboe (billions of barrels of oil equivalent)” Reid said. He was a guest speaker at a recent
conference of the International Association of Drilling Contractors (IADC), an agency that has exclusively represented the worldwide oil and gas drilling industry since 1940.
Reid also affirmed that “There is no doubt that further drilling activity in this region could have a major impact on offshore production in the foreseeable future.”
Meanwhile, Infield Systems Ltd. has identified more than 130 Bboe in discovered oil, gas, and condensate reserves throughout the offshore arctic and sub-arctic regions.
Around 114 Bboe are gas reserves, and 16 Bbbl (billions of barrels) are oil. Infield’s additional report on offshore arctic oil and gas prospects through 2017 includes current
and future offshore oil and gas developments within the Arctic Circle, and in the “sub-arctic” regions of Sakhalin Island, the Jeanne d’Arc basin offshore eastern Canada,
and the Cook Inlet off Alaska.
Arctic capital expenditure should increase more than $7 billion annually through 2017. Russia, with its reserves, should largely drive this expenditure, especially during
2013-2015, assuming the Shtokman project goes ahead. This project includes a comprehensive development of satellites in the Barents Sea, and joint development of the
Prirazlomnoye and Dolginskoye oil fields in the Pechora Sea.
Prirazlomnaya is the first offshore ice-resistant stationary platform designed and built in Russia measuring 126 m (413 ft) wide by 126 m long. With a weight of 117,000
tons, the platform can accommodate a crew of up to 200, and provide year-round operation.
Offshore Magazine. May 2, 2012 . Volume 72, Issue 5 Available at: <http://www.offshore-mag.com/articles/print/volume-72/issue-5/international-report/arctic-e-p-activity-heats-up.html>. Retrieved on:
9 May 2012. Adapted.
a) located 16 Bbbl of oil throughout the offshore arctic and sub-arctic regions.
b) reported 114 Bboe of gas prospects.
c) started exploring the Jeanne d’Arc basin offshore eastern Canada, but will only include the Cook Inlet off Alaska in 2017.
d) found 130 Bboe in oil and gas on Sakhalin Island.
e) broadcast a potential for more than 130 Bboe in gas reserves, but only expects to find it by 2017.
www.tecconcursos.com.br/questoes/291622
By Jessica Tippee
Assistant Editor
Not Mexico, not Brazil. The next offshore frontier is the Arctic, according to Andrew Reid, CEO of energy analysts Douglas-Westwood Company. “More than 400 fields have
been discovered to date in the Arctic, providing reserves in excess of 240 Bboe (billions of barrels of oil equivalent)” Reid said. He was a guest speaker at a recent
conference of the International Association of Drilling Contractors (IADC), an agency that has exclusively represented the worldwide oil and gas drilling industry since 1940.
Reid also affirmed that “There is no doubt that further drilling activity in this region could have a major impact on offshore production in the foreseeable future.”
Meanwhile, Infield Systems Ltd. has identified more than 130 Bboe in discovered oil, gas, and condensate reserves throughout the offshore arctic and sub-arctic regions.
Around 114 Bboe are gas reserves, and 16 Bbbl (billions of barrels) are oil. Infield’s additional report on offshore arctic oil and gas prospects through 2017 includes current
and future offshore oil and gas developments within the Arctic Circle, and in the “sub-arctic” regions of Sakhalin Island, the Jeanne d’Arc basin offshore eastern Canada,
and the Cook Inlet off Alaska.
Arctic capital expenditure should increase more than $7 billion annually through 2017. Russia, with its reserves, should largely drive this expenditure, especially during
2013-2015, assuming the Shtokman project goes ahead. This project includes a comprehensive development of satellites in the Barents Sea, and joint development of the
Prirazlomnoye and Dolginskoye oil fields in the Pechora Sea.
Prirazlomnaya is the first offshore ice-resistant stationary platform designed and built in Russia measuring 126 m (413 ft) wide by 126 m long. With a weight of 117,000
tons, the platform can accommodate a crew of up to 200, and provide year-round operation.
The platform is designed to withstand temperatures that can drop to −50 °C (−58 °F) during winter, and ice formation – the location is typically free from ice for 110 days
each year. The platform will provide drilling, production, and oil storage services, along with preparation and shipment of final products from the Prirazlomnoye field.
Gazprom expects to drill up to 40 directional wells. Dutch contractor Tideway has been dumping 100,000 metric tons of stone (110,231 tons) as an erosion protection
system around the platform to secure it to the seabed. The development is targeting annual production of more than 6 million tons (43.8 MMbbl). Associated produced gas
will be used for the platform’s needs. Production operations are scheduled to start this year.
Offshore Magazine. May 2, 2012 . Volume 72, Issue 5 Available at: <http://www.offshore-mag.com/articles/print/volume-72/issue-5/international-report/arctic-e-p-activity-heats-up.html>. Retrieved on:
9 May 2012. Adapted.
www.tecconcursos.com.br/questoes/291625
By John Phillip
Americans drink more than 216 liters of carbonated soft drinks each year, a number that continues to increase at an alarming rate. Many people use low-calorie diet soda in a futile effort to lose weight. Yet they find that these drinks have the opposite effect, leading them to be overweight or obese.
The high acid content in most carbonated beverages removes calcium and other critical nutrients from the bone and tissues, significantly increasing disease risk over years
of consumption.
Researchers from Cleveland Clinic’s Wellness Institute and Harvard University have reported the result of a study in the American Journal of Clinical Nutrition, the first to
examine soda’s effect on stroke risk and vascular diseases.
Past studies have linked sugar-sweetened beverage consumption with weight gain, diabetes, high blood pressure, high cholesterol, gout and coronary artery disease, but
current research has implicated diet soft drink consumption with increased disease risk and weight gain due to depletion of essential minerals.
Lead study author Dr Adam Bernstein noted “Soda remains the largest source of added sugar in the diet. What we’re beginning to understand is that regular intake of
these beverages sets off a chain reaction in the body that can potentially lead to many diseases, including stroke. Researchers analyzed soda consumption among 43,371
men and 84,085 women over a time span of nearly thirty years. During that time, 2,938 strokes were documented in women while 1,416 strokes were documented in
men.”
Regarding low calorie drinks, researchers concluded “older adults who drank diet soda daily had a 43% increased risk of heart attacks or strokes compared to those that
never drank diet soda”.
The suggestion is to substitute carbonated beverage consumption with an antioxidant-packed cup of green tea or coffee to significantly reduce risk of strokes and vascular
diseases.
Alexander’s Gas & Oil Connections Magazine. May 12, 2012 Available at: <http://www.gasandoil.com/oilaround/other/3425a2d6 a41705a0f36cf3796041db1e>. Retrieved on: 9 May 2012. Adapted.
www.tecconcursos.com.br/questoes/291626
By John Phillip
Americans drink more than 216 liters of carbonated soft drinks each year, a number that continues to increase at an alarming rate. Many people use low-calorie diet soda in a futile effort to lose weight. Yet they find that these drinks have the opposite effect, leading them to be overweight or obese.
The high acid content in most carbonated beverages removes calcium and other critical nutrients from the bone and tissues, significantly increasing disease risk over years
of consumption.
Researchers from Cleveland Clinic’s Wellness Institute and Harvard University have reported the result of a study in the American Journal of Clinical Nutrition, the first to
examine soda’s effect on stroke risk and vascular diseases.
Past studies have linked sugar-sweetened beverage consumption with weight gain, diabetes, high blood pressure, high cholesterol, gout and coronary artery disease, but
current research has implicated diet soft drink consumption with increased disease risk and weight gain due to depletion of essential minerals.
Lead study author Dr Adam Bernstein noted “Soda remains the largest source of added sugar in the diet. What we’re beginning to understand is that regular intake of
these beverages sets off a chain reaction in the body that can potentially lead to many diseases, including stroke. Researchers analyzed soda consumption among 43,371
men and 84,085 women over a time span of nearly thirty years. During that time, 2,938 strokes were documented in women while 1,416 strokes were documented in
men.”
Despite the millions of dollars spent by soda marketers to instill the virtues of drinking soda, there is nothing healthy about consuming any type of carbonated beverage.
Moreover, the study did note that drinking coffee was associated with a 10% lower risk of stroke, compared to drinking sweetened beverages.
Regarding low calorie drinks, researchers concluded “older adults who drank diet soda daily had a 43% increased risk of heart attacks or strokes compared to those that
never drank diet soda”.
The suggestion is to substitute carbonated beverage consumption with an antioxidant-packed cup of green tea or coffee to significantly reduce risk of strokes and vascular
diseases.
Alexander’s Gas & Oil Connections Magazine. May 12, 2012 Available at: <http://www.gasandoil.com/oilaround/other/3425a2d6 a41705a0f36cf3796041db1e>. Retrieved on: 9 May 2012. Adapted.
One negative effect of sugar-sweetened beverage consumption and one negative effect of diet soft drink consumption are respectively
a) coronary artery disease and liver disease
b) gout and vitamin suppression
c) low cholesterol and weight gain
d) mineral suppression and high blood pressure
e) weight gain and essential mineral suppression
www.tecconcursos.com.br/questoes/291627
By John Phillip
Americans drink more than 216 liters of carbonated soft drinks each year, a number that continues to increase at an alarming rate. Many people use low-calorie diet soda in a futile effort to lose weight. Yet they find that these drinks have the opposite effect, leading them to be overweight or obese.
The high acid content in most carbonated beverages removes calcium and other critical nutrients from the bone and tissues, significantly increasing disease risk over years
of consumption.
Researchers from Cleveland Clinic’s Wellness Institute and Harvard University have reported the result of a study in the American Journal of Clinical Nutrition, the first to
examine soda’s effect on stroke risk and vascular diseases.
Past studies have linked sugar-sweetened beverage consumption with weight gain, diabetes, high blood pressure, high cholesterol, gout and coronary artery disease, but
current research has implicated diet soft drink consumption with increased disease risk and weight gain due to depletion of essential minerals.
Lead study author Dr Adam Bernstein noted “Soda remains the largest source of added sugar in the diet. What we’re beginning to understand is that regular intake of
these beverages sets off a chain reaction in the body that can potentially lead to many diseases, including stroke. Researchers analyzed soda consumption among 43,371
men and 84,085 women over a time span of nearly thirty years. During that time, 2,938 strokes were documented in women while 1,416 strokes were documented in
men.”
Despite the millions of dollars spent by soda marketers to instill the virtues of drinking soda, there is nothing healthy about consuming any type of carbonated beverage.
Moreover, the study did note that drinking coffee was associated with a 10% lower risk of stroke, compared to drinking sweetened beverages.
Regarding low calorie drinks, researchers concluded “older adults who drank diet soda daily had a 43% increased risk of heart attacks or strokes compared to those that
never drank diet soda”.
The suggestion is to substitute carbonated beverage consumption with an antioxidant-packed cup of green tea or coffee to significantly reduce risk of strokes and vascular
diseases.
Alexander’s Gas & Oil Connections Magazine. May 12, 2012 Available at: <http://www.gasandoil.com/oilaround/other/3425a2d6 a41705a0f36cf3796041db1e>. Retrieved on: 9 May 2012. Adapted.
www.tecconcursos.com.br/questoes/291918
by Valerie Ross
from Discover Magazine:
Mind & Brain / Memory, Emotions & Decisions
When Timothy Lu was in medical school, he treated a veteran whose multiple sclerosis was so severe that she had to use a urinary catheter. As often happens with
invasive medical devices, the catheters became infected with biofilms: gooey, antibiotic-resistant layers of bacteria. Now the 30-year-old MIT professor, who first trained as
an engineer, designs viruses that destroy biofilms, which cause everything from staph infections to cholera outbreaks and that account for 65 percent of human infections
overall.
Discover: You started as an electrical engineer. Was it a difficult transition becoming a biologist?
Lu: I came into the lab not really understanding how to do biology experiments and deal with chemicals. I’m not a great experimentalist with my hands, and one night I set
the lab on fire.
Lu: A biofilm is essentially a three-dimensional community of bacteria that live together, kind of like a bacterial apartment building or city. Biofilms are made up of the
bacterial cells as well as all sorts of other material — carbohydrates, proteins, and so on — that the bacteria build to protect themselves.
Lu: Before I started medical school, I didn’t think bacterial infections were a big deal, because I assumed antibiotics had taken care of them, but then I started seeing
patients with significant biofilm infections that couldn’t be cured.
Lu: We use viruses called phages that infect bacteria but not human cells. We cut the phages’ DNA and insert a synthetic gene into the phage genome. That gene
produces enzymes that can go out into the biofilm and chew it up.
Discover: If you had just $10 for entertainment, how would you spend your day?
Lu: What can you even buy with $10? Maybe I would buy a magnifying glass and just peer around in the soil to see what other life was going on down there. That would
actually be fun.
www.tecconcursos.com.br/questoes/291921
by Valerie Ross
When Timothy Lu was in medical school, he treated a veteran whose multiple sclerosis was so severe that she had to use a urinary catheter. As often happens with
invasive medical devices, the catheters became infected with biofilms: gooey, antibiotic-resistant layers of bacteria. Now the 30-year-old MIT professor, who first trained as
an engineer, designs viruses that destroy biofilms, which cause everything from staph infections to cholera outbreaks and that account for 65 percent of human infections
overall.
Discover: You started as an electrical engineer. Was it a difficult transition becoming a biologist?
Lu: I came into the lab not really understanding how to do biology experiments and deal with chemicals. I’m not a great experimentalist with my hands, and one night I set
the lab on fire.
Lu: A biofilm is essentially a three-dimensional community of bacteria that live together, kind of like a bacterial apartment building or city. Biofilms are made up of the
bacterial cells as well as all sorts of other material — carbohydrates, proteins, and so on — that the bacteria build to protect themselves.
Lu: Before I started medical school, I didn’t think bacterial infections were a big deal, because I assumed antibiotics had taken care of them, but then I started seeing
patients with significant biofilm infections that couldn’t be cured.
Lu: We use viruses called phages that infect bacteria but not human cells. We cut the phages’ DNA and insert a synthetic gene into the phage genome. That gene
produces enzymes that can go out into the biofilm and chew it up.
Discover: If you had just $10 for entertainment, how would you spend your day?
Lu: What can you even buy with $10? Maybe I would buy a magnifying glass and just peer around in the soil to see what other life was going on down there. That would
actually be fun.
a) extracting phages that are infected by a virus that can destroy all enzymes in the bacteria.
b) producing an enzyme that is inserted in a genetically marked bacteria to support viruses that live in the biofilm.
c) triggering a bacterial infection to the viruses that in turn yield enzymes that potently destroy the biofilm.
d) altering a special human-safe virus in order to produce an enzyme that penetrates the biofilm and destroys it.
e) inserting a synthetic gene in the phages genome that will affect the production of virus that get organized into biofilms.
www.tecconcursos.com.br/questoes/291924
by Valerie Ross
from Discover Magazine:
Mind & Brain / Memory, Emotions & Decisions
When Timothy Lu was in medical school, he treated a veteran whose multiple sclerosis was so severe that she had to use a urinary catheter. As often happens with
invasive medical devices, the catheters became infected with biofilms: gooey, antibiotic-resistant layers of bacteria. Now the 30-year-old MIT professor, who first trained as
an engineer, designs viruses that destroy biofilms, which cause everything from staph infections to cholera outbreaks and that account for 65 percent of human infections
overall.
Discover: You started as an electrical engineer. Was it a difficult transition becoming a biologist?
Lu: I came into the lab not really understanding how to do biology experiments and deal with chemicals. I’m not a great experimentalist with my hands, and one night I set
the lab on fire.
Lu: A biofilm is essentially a three-dimensional community of bacteria that live together, kind of like a bacterial apartment building or city. Biofilms are made up of the
bacterial cells as well as all sorts of other material — carbohydrates, proteins, and so on — that the bacteria build to protect themselves.
Lu: Before I started medical school, I didn’t think bacterial infections were a big deal, because I assumed antibiotics had taken care of them, but then I started seeing
patients with significant biofilm infections that couldn’t be cured.
Lu: We use viruses called phages that infect bacteria but not human cells. We cut the phages’ DNA and insert a synthetic gene into the phage genome. That gene
produces enzymes that can go out into the biofilm and chew it up.
Discover: If you had just $10 for entertainment, how would you spend your day?
Lu: What can you even buy with $10? Maybe I would buy a magnifying glass and just peer around in the soil to see what other life was going on down there. That would
actually be fun.
www.tecconcursos.com.br/questoes/291932
by Scientific American
Top physicists have recently reached a frenzy over the announcement that the Large Hadron Collider in Geneva is planning to release what is widely expected to be
tantalizing - although not conclusive - evidence for the existence of the Higgs boson, the elementary particle hypothesized to be the origin of the mass of all matter.
Many physicists have already swung into action, swapping rumors about the contents of the announcement and proposing grand ideas about what those rumors would
mean, if true. “It’s impossible to be excited enough,” says Gordon Kane, a theoretical physicist at the University of Michigan at Ann Arbor.
The spokespeople of the collaborations using the cathedral-size ATLAS and CMS detectors to search for the Higgs boson and other phenomena at the 27-kilometer-
circumference proton accelerator of the Large Hadron Collider (LHC) are scheduled to present updates based on analyses of the data collected to date. “There won’t be a
discovery announcement, but it does promise to be interesting, since there are rumors that scientists have seen hints of the elusive Higgs boson” says James Gillies,
spokesperson for CERN (European Organization for Nuclear Research), which hosts the LHC.
Joe Lykken, a theoretical physicist at Fermi National Accelerator Laboratory in Batavia, Ill, and a member of the CMS collaboration, says: “Whatever happens eventually
with the Higgs, I think we’ll look back on this meeting and say, ‘This was the beginning of something.’” (As a CMS member, Lykken says he is not yet sure himself what
results ATLAS would unveil; he is bound by his collaboration’s rules not to reveal what CMS has in hand.)
www.tecconcursos.com.br/questoes/291933
by Scientific American
Top physicists have recently reached a frenzy over the announcement that the Large Hadron Collider in Geneva is planning to release what is widely expected to be
tantalizing - although not conclusive - evidence for the existence of the Higgs boson, the elementary particle hypothesized to be the origin of the mass of all matter.
Many physicists have already swung into action, swapping rumors about the contents of the announcement and proposing grand ideas about what those rumors would
mean, if true. “It’s impossible to be excited enough,” says Gordon Kane, a theoretical physicist at the University of Michigan at Ann Arbor.
The spokespeople of the collaborations using the cathedral-size ATLAS and CMS detectors to search for the Higgs boson and other phenomena at the 27-kilometer-
circumference proton accelerator of the Large Hadron Collider (LHC) are scheduled to present updates based on analyses of the data collected to date. “There won’t be a
discovery announcement, but it does promise to be interesting, since there are rumors that scientists have seen hints of the elusive Higgs boson” says James Gillies,
spokesperson for CERN (European Organization for Nuclear Research), which hosts the LHC.
Joe Lykken, a theoretical physicist at Fermi National Accelerator Laboratory in Batavia, Ill, and a member of the CMS collaboration, says: “Whatever happens eventually
with the Higgs, I think we’ll look back on this meeting and say, ‘This was the beginning of something.’” (As a CMS member, Lykken says he is not yet sure himself what
results ATLAS would unveil; he is bound by his collaboration’s rules not to reveal what CMS has in hand.)
The excerpt “Many physicists have already swung into action” could be properly completed in
a) yesterday after they heard the rumors.
b) before they heard the rumors.
c) since they heard the rumors.
d) if they hear the rumors.
e) when they will hear the rumors.
www.tecconcursos.com.br/questoes/291937
by Scientific American
Top physicists have recently reached a frenzy over the announcement that the Large Hadron Collider in Geneva is planning to release what is widely expected to be tantalizing - although not conclusive - evidence for the existence of the Higgs boson, the elementary particle hypothesized to be the origin of the mass of all matter.
Many physicists have already swung into action, swapping rumors about the contents of the announcement and proposing grand ideas about what those rumors would
mean, if true. “It’s impossible to be excited enough,” says Gordon Kane, a theoretical physicist at the University of Michigan at Ann Arbor.
The spokespeople of the collaborations using the cathedral-size ATLAS and CMS detectors(a) to search for the Higgs boson and other phenomena(b) at the 27-kilometer-
circumference proton accelerator of the Large Hadron Collider (LHC) are scheduled to present updates based on analyses of the data collected to date(c). “There won’t be
a discovery announcement, but it does promise to be interesting(d), since there are rumors that scientists have seen hints of the elusive Higgs boson(e)” says James Gillies,
spokesperson for CERN (European Organization for Nuclear Research), which hosts the LHC.
Joe Lykken, a theoretical physicist at Fermi National Accelerator Laboratory in Batavia, Ill, and a member of the CMS collaboration, says: “Whatever happens eventually
with the Higgs, I think we’ll look back on this meeting and say, ‘This was the beginning of something.’” (As a CMS member, Lykken says he is not yet sure himself what
results ATLAS would unveil; he is bound by his collaboration’s rules not to reveal what CMS has in hand.)
a) “using the cathedral-size ATLAS and CMS detectors,”– has as its subject “the spokespeople of the collaboration”.
b) “and other phenomena”– has a word whose plural form is phenomenon.
c) “based on analyses of the data collected to date.”– means the analyses collected up to that time.
d) “it does promise to be interesting”– has an auxiliary verb used for emphasis.
e) “have seen hints of the elusive Higgs boson”– has words whose synonyms are respectively cues and obscure
www.tecconcursos.com.br/questoes/291939
by Scientific American
Top physicists have recently reached a frenzy over the announcement that the Large Hadron Collider in Geneva is planning to release what is widely expected to be
tantalizing - although not conclusive - evidence for the existence of the Higgs boson, the elementary particle hypothesized to be the origin of the mass of all matter.
Many physicists have already swung into action, swapping rumors about the contents of the announcement and proposing grand ideas about what those rumors would
mean, if true. “It’s impossible to be excited enough,” says Gordon Kane, a theoretical physicist at the University of Michigan at Ann Arbor.
The spokespeople of the collaborations using the cathedral-size ATLAS and CMS detectors(a) to search for the Higgs boson and other phenomena(b) at the 27-kilometer-
circumference proton accelerator of the Large Hadron Collider (LHC) are scheduled to present updates based on analyses of the data collected to date(c). “There won’t be
a discovery announcement, but it does promise to be interesting(d), since there are rumors that scientists have seen hints of the elusive Higgs boson(e)” says James Gillies,
spokesperson for CERN (European Organization for Nuclear Research), which hosts the LHC.
Joe Lykken, a theoretical physicist at Fermi National Accelerator Laboratory in Batavia, Ill, and a member of the CMS collaboration, says: “Whatever happens eventually
with the Higgs, I think we’ll look back on this meeting and say, ‘This was the beginning of something.’” (As a CMS member, Lykken says he is not yet sure himself what
results ATLAS would unveil; he is bound by his collaboration’s rules not to reveal what CMS has in hand.)
a) Dr. Higgs is bound by the collaboration’s rules and therefore should keep quiet.
b) even not knowing what will come, he believes science will reach a turning point with the Higgs news.
c) he will be free to talk about the news after ATLAS releases it.
d) he is doubtful about the real importance of the Higgs.
e) the theoretical physicists at Fermi National Accelerator Laboratory in Batavia will look back on the meeting about Dr. Higgs.
www.tecconcursos.com.br/questoes/349123
Kathy Murdock
Wednesday, February 9, 2011
Unhappy at work? Then you aren’t alone. The annual Conference Board job satisfaction survey shows that more than half of all Americans (a whopping 52 percent!) are
dissatisfied with their jobs. It’s not necessarily the work that is making us unhappy, though; sometimes, it is how we decide to look at and deal with our tasks that cause us
stress on the job.
As we know, pessimism is never a good trait, and boredom makes us do things we would never think of doing. And anxiety? If you spend most of your time worrying about
what has to be done or how you did something that has already been completed, you’ll never be able to completely move forward to the next task.
Mental frustrations aren’t the only things that give us pause at the office. Sitting too long in an office chair, never seeing sunlight from 9-5, becoming sedentary day after
day, and eating poorly on the job can all take their negative toll.
Brant Secunda and Mark Allen, authors of the book Fit Soul, Fit Body, offer today’s working mothers (and fathers) tips for feeling better while at the workplace.
Get Up!
If you sit in your office chair from 9-5 you’ll reduce the amount of lipoprotein lipase, a fat-burning enzyme, by 94 percent. Standing for 30 minutes each day will get this
enzyme going. The authors suggest rising from that ergonomic chair to answer the phone, consult with a coworker, or read the latest article.
Ever wonder how a top athlete can practice the same skill day after day, or how someone can force her body (and mind) to run 26.2 miles? Much of this willpower and
stamina comes from the mind. To continually do the same thing over and over, the person doing the task needs to think positive and, the authors say, embrace the power
of repetition.
Look toward what it is you are accomplishing, the ultimate goal, and not at the small steps it takes to get you there.
Oftentimes if I am dreading a big project, I will find other things to do to occupy my time while I get up my strength to work on it. The authors say that committing to
working on the project for five minutes is all you need, because once you start you will probably find it is not that bad after all. Even if it is that bad, you are doing it, and it
will be easier to complete if you have been chipping away at it for five minutes a day. Besides, once we do get started we usually stick to it because we want to see it
through. So, suck it up and jump in there, even if it is only for a short period of time.
Oftentimes you might find yourself thinking negative thoughts about the workplace. You don’t want to do a certain project because it is too hard or time-consuming; you
don’t want to have to partner up with a certain person because they don’t share the weight on projects; you are unhappy with the way the boss handles issues around the
workplace.
Instead of practicing negative thoughts, learn to weight lift for the soul, which the authors say is “giving up negative thoughts that weigh you down.” How do you do this?
“The next time a negative thought comes into your mind,” write the authors, “force yourself to restate it to yourself in a positive way.” For instance, if you are thinking
something is too hard, look at it as though you have what it takes to get the job done. If you find yourself considering a particular task a waste of your time, instead think
about what it is you can learn from doing that task.
www.tecconcursos.com.br/questoes/349125
Kathy Murdock
Wednesday, February 9, 2011
Unhappy at work? Then you aren’t alone. The annual Conference Board job satisfaction survey shows that more than half of all Americans (a whopping 52 percent!) are
dissatisfied with their jobs. It’s not necessarily the work that is making us unhappy, though; sometimes, it is how we decide to look at and deal with our tasks that cause us
stress on the job.
As we know, pessimism is never a good trait, and boredom makes us do things we would never think of doing. And anxiety? If you spend most of your time worrying about
what has to be done or how you did something that has already been completed, you’ll never be able to completely move forward to the next task.
Mental frustrations aren’t the only things that give us pause at the office. Sitting too long in an office chair, never seeing sunlight from 9-5, becoming sedentary day after
day, and eating poorly on the job can all take their negative toll.
Brant Secunda and Mark Allen, authors of the book Fit Soul, Fit Body, offer today’s working mothers (and fathers) tips for feeling better while at the workplace.
Get Up!
If you sit in your office chair from 9-5 you’ll reduce the amount of lipoprotein lipase, a fat-burning enzyme, by 94 percent. Standing for 30 minutes each day will get this
enzyme going. The authors suggest rising from that ergonomic chair to answer the phone, consult with a coworker, or read the latest article.
Ever wonder how a top athlete can practice the same skill day after day, or how someone can force her body (and mind) to run 26.2 miles? Much of this willpower and
stamina comes from the mind. To continually do the same thing over and over, the person doing the task needs to think positive and, the authors say, embrace the power
of repetition.
Look toward what it is you are accomplishing, the ultimate goal, and not at the small steps it takes to get you there.
Oftentimes if I am dreading a big project, I will find other things to do to occupy my time while I get up my strength to work on it. The authors say that committing to
working on the project for five minutes is all you need, because once you start you will probably find it is not that bad after all. Even if it is that bad, you are doing it, and it
will be easier to complete if you have been chipping away at it for five minutes a day. Besides, once we do get started we usually stick to it because we want to see it
through. So, suck it up and jump in there, even if it is only for a short period of time.
Oftentimes you might find yourself thinking negative thoughts about the workplace. You don’t want to do a certain project because it is too hard or time-consuming; you don’t want to have to partner up with a certain person because they don’t share the weight on projects; you are unhappy with the way the boss handles issues around the workplace.
Instead of practicing negative thoughts, learn to weight lift for the soul, which the authors say is “giving up negative thoughts that weigh you down.” How do you do this?
“The next time a negative thought comes into your mind,” write the authors, “force yourself to restate it to yourself in a positive way.” For instance, if you are thinking
something is too hard, look at it as though you have what it takes to get the job done. If you find yourself considering a particular task a waste of your time, instead think
about what it is you can learn from doing that task.
The only recommendation that is NOT present in the text is that workers should
a) rise from their chairs daily for half an hour.
b) be prepared to use repetition to their own benefit.
c) think positive and keep focused on their main target.
d) reserve long periods of time every day to work on big projects.
e) take five minutes a day to focus on hard tasks and finish them faster.
www.tecconcursos.com.br/questoes/349137
Kathy Murdock
Wednesday, February 9, 2011
Unhappy at work? Then you aren’t alone. The annual Conference Board job satisfaction survey shows that more than half of all Americans (a whopping 52 percent!) are
dissatisfied with their jobs. It’s not necessarily the work that is making us unhappy, though; sometimes, it is how we decide to look at and deal with our tasks that cause us
stress on the job.
As we know, pessimism is never a good trait, and boredom makes us do things we would never think of doing. And anxiety? If you spend most of your time worrying about
what has to be done or how you did something that has already been completed, you’ll never be able to completely move forward to the next task.
Mental frustrations aren’t the only things that give us pause at the office. Sitting too long in an office chair, never seeing sunlight from 9-5, becoming sedentary day after
day, and eating poorly on the job can all take their negative toll.
Brant Secunda and Mark Allen, authors of the book Fit Soul, Fit Body, offer today’s working mothers (and fathers) tips for feeling better while at the workplace.
Get Up!
If you sit in your office chair from 9-5 you’ll reduce the amount of lipoprotein lipase, a fat-burning enzyme, by 94 percent. Standing for 30 minutes each day will get this
enzyme going. The authors suggest rising from that ergonomic chair to answer the phone, consult with a coworker, or read the latest article.
Ever wonder how a top athlete can practice the same skill day after day, or how someone can force her body (and mind) to run 26.2 miles? Much of this willpower and
stamina comes from the mind. To continually do the same thing over and over, the person doing the task needs to think positive and, the authors say, embrace the power
of repetition.
Look toward what it is you are accomplishing, the ultimate goal, and not at the small steps it takes to get you there.
Oftentimes if I am dreading a big project, I will find other things to do to occupy my time while I get up my strength to work on it. The authors say that committing to
working on the project for five minutes is all you need, because once you start you will probably find it is not that bad after all. Even if it is that bad, you are doing it, and it
will be easier to complete if you have been chipping away at it for five minutes a day. Besides, once we do get started we usually stick to it because we want to see it
through. So, suck it up and jump in there, even if it is only for a short period of time.
Oftentimes you might find yourself thinking negative thoughts about the workplace. You don’t want to do a certain project because it is too hard or time-consuming; you
don’t want to have to partner up with a certain person because they don’t share the weight on projects; you are unhappy with the way the boss handles issues around the
workplace.
Instead of practicing negative thoughts, learn to weight lift for the soul, which the authors say is “giving up negative thoughts that weigh you down.” How do you do this?
“The next time a negative thought comes into your mind,” write the authors, “force yourself to restate it to yourself in a positive way.” For instance, if you are thinking
something is too hard, look at it as though you have what it takes to get the job done. If you find yourself considering a particular task a waste of your time, instead think
about what it is you can learn from doing that task.
The sentence in which the boldfaced expression introduces an idea of exemplification is:
www.tecconcursos.com.br/questoes/349253
By Heather Huhman
There’s a debate going on among career experts about which is more important: skillset or mindset. While skills are certainly desirable for many positions, does having the
right ones guarantee you’ll get the job?
What if you have the mindset to get the work accomplished, but currently lack certain skills requested by the employer? Jennifer Fremont-Smith, CEO of Smarterer, and
Paul G. Stoltz, PhD, coauthor of Put Your Mindset to Work: The One Asset You Really Need to Win and Keep the Job You Love, recently sat down with U.S. News to sound
off on this issue.
Jennifer: For many jobs, skillset needs to come first. The employer absolutely must find people who have the hard skills to do whatever it is they are being hired to do.
Programmers have to know how to program. Data analysts need to know how to crunch numbers in Excel. Marketers must know their marketing tools and software. Social
media managers must know the tools of their trade like Twitter, Facebook, WordPress, and have writing and communication skills.
After the employers have identified candidates with these hard skills, they can shift their focus to their candidates’ mindsets - attitude, integrity, work ethic, personality, etc.
Jennifer: Despite record high unemployment, many jobs sit empty because employers can’t find candidates with the right skills. In a recent survey cited in the Wall Street
Journal, over 50 percent of companies reported difficulty finding applicants with the right skills. Companies are running lean and mean in this economy – they don’t have
the time to train for those key skills.
Paul: [Co-author James Reed and I] asked tens of thousands of top employers worldwide this question: If you were hiring someone today, which would you pick, A) the person with the perfect skills and qualifications, but lacking the desired mindset, or B) the person with the desired mindset, but lacking the rest? Ninety-eight percent pick B. Add to this that 97 percent said it is more likely that a person with the right mindset will develop the right skillset, rather than the other way around.
Jennifer: At Smarterer, we define skillset as the set of digital, social, and technical tools professionals use to be effective in the workforce. Professionals are rapidly
accumulating these skills, and the tools themselves are proliferating and evolving – we’re giving people a simple, smart way for people to validate their skillset and
articulate it to the world.
Paul: We define mindset as “the lens through which you see and navigate life.” It undergirds and affects all that you think, see, believe, say, and do.
Heather: How can job seekers show they have the skillset employers are seeking throughout the entire hiring process?
Jennifer: At the beginning of the process, seekers can showcase the skills they have by incorporating them, such as their Smarterer scores, throughout their professional
and personal brand materials. They should be articulating their skills in their resume, cover letter, LinkedIn profile, blog, website - everywhere they express their
professional identity.
Heather: How can job seekers show they have the mindset employers are seeking throughout the entire hiring process?
Paul: One of the most head-spinning studies we did, which was conducted by an independent statistician showed that, out of 30,000 CVs/resumes, when you look at who
gets the job and who does not:
A. The conventional wisdom fails (at best). None of the classic, accepted advice, like using action verbs or including hobbies/interests actually made any difference.
B. The only factor that made the difference was that those who had one of the 72 mindset qualities from our master model, articulated in their CV/resume, in a specific
way, were three times as likely to get the job. Furthermore, those who had two or more of these statements, were seven times more likely to get the job, often over other
more qualified candidates.
www.tecconcursos.com.br/questoes/349254
By Heather Huhman
There’s a debate going on among career experts about which is more important: skillset or mindset. While skills are certainly desirable for many positions, does having the
right ones guarantee you’ll get the job?
What if you have the mindset to get the work accomplished, but currently lack certain skills requested by the employer? Jennifer Fremont-Smith, CEO of Smarterer, and
Paul G. Stoltz, PhD, coauthor of Put Your Mindset to Work: The One Asset You Really Need to Win and Keep the Job You Love, recently sat down with U.S. News to sound
off on this issue.
Jennifer: For many jobs, skillset needs to come first. The employer absolutely must find people who have the hard skills to do whatever it is they are being hired to do.
Programmers have to know how to program. Data analysts need to know how to crunch numbers in Excel. Marketers must know their marketing tools and software. Social
media managers must know the tools of their trade like Twitter, Facebook, WordPress, and have writing and communication skills.
After the employers have identified candidates with these hard skills, they can shift their focus to their candidates’ mindsets - attitude, integrity, work ethic, personality, etc.
Jennifer: Despite record high unemployment, many jobs sit empty because employers can’t find candidates with the right skills. In a recent survey cited in the Wall Street
Journal, over 50 percent of companies reported difficulty finding applicants with the right skills. Companies are running lean and mean in this economy – they don’t have
the time to train for those key skills.
Paul: [Co-author James Reed and I] asked tens of thousands of top employers worldwide this question: If you were hiring someone today, which would you pick, A) the person with the perfect skills and qualifications, but lacking the desired mindset, or B) the person with the desired mindset, but lacking the rest? Ninety-eight percent pick B. Add to this that 97 percent said it is more likely that a person with the right mindset will develop the right skillset, rather than the other way around.
Jennifer: At Smarterer, we define skillset as the set of digital, social, and technical tools professionals use to be effective in the workforce. Professionals are rapidly
accumulating these skills, and the tools themselves are proliferating and evolving – we’re giving people a simple, smart way for people to validate their skillset and
articulate it to the world.
Paul: We define mindset as “the lens through which you see and navigate life.” It undergirds and affects all that you think, see, believe, say, and do.
Heather: How can job seekers show they have the skillset employers are seeking throughout the entire hiring process?
Jennifer: At the beginning of the process, seekers can showcase the skills they have by incorporating them, such as their Smarterer scores, throughout their professional
and personal brand materials. They should be articulating their skills in their resume, cover letter, LinkedIn profile, blog, website - everywhere they express their
professional identity.
Heather: How can job seekers show they have the mindset employers are seeking throughout the entire hiring process?
Paul: One of the most head-spinning studies we did, which was conducted by an independent statistician showed that, out of 30,000 CVs/resumes, when you look at who
gets the job and who does not:
A. The conventional wisdom fails (at best). None of the classic, accepted advice, like using action verbs or including hobbies/interests actually made any difference.
B. The only factor that made the difference was that those who had one of the 72 mindset qualities from our master model, articulated in their CV/resume, in a specific
way, were three times as likely to get the job. Furthermore, those who had two or more of these statements, were seven times more likely to get the job, often over other
more qualified candidates.
Jennifer Fremont-Smith and Paul G. Stoltz are both interviewed in this article because they
a) have written books on how to conquer a dream job.
b) are chief executives from renowned American companies.
c) have identical points of view and experiences about the necessary qualifications in an employee.
d) show different perspectives concerning what employers value in a job candidate.
e) agree that all employers value the same set of technical skills in all employees.
www.tecconcursos.com.br/questoes/349256
a) today’s employers intend to invest large sums of money training new employees.
b) most employees nowadays are indifferent to the use of digital, social and technical tools in the workplace.
c) candidates should be able to display and present their skills in different formats that will be seen by prospective employers.
d) many employers consider it unnecessary to learn about the job seekers’ attitudes, integrity and personality.
e) no company nowadays can find employees with the hard skills required by the job market.
www.tecconcursos.com.br/questoes/349258
www.tecconcursos.com.br/questoes/349260
The pronoun they in “they don’t have the time to train for those key skills” refers to
a) “employers”
b) “candidates”
c) “companies”
www.tecconcursos.com.br/questoes/349263
www.tecconcursos.com.br/questoes/349265
The study mentioned by Paul Stoltz shows that, to get a job, candidates must
a) mention in their CVs or resumes at least one mindset quality from a pre-selected group identified in Stoltz’s model.
b) show they are qualified applicants for the function by making a list of their seven best mindset qualities.
c) list their 72 most relevant aptitudes and capabilities, in accordance with Stoltz’s master model.
d) send their resumes three times to the same employer before being accepted.
e) use action verbs and report on hobbies and interests in their resumes.
www.tecconcursos.com.br/questoes/349268
According to Jennifer Fremont-Smith and Paul G. Stoltz, mindset includes all of the following EXCEPT
www.tecconcursos.com.br/questoes/352340
Today’s meeting is really about you. I can stand in front of you and talk about working safely and what procedures to follow until I’m blue in the face. But until you
understand the need for working safely, until you are willing to be responsible for your safety, it doesn’t mean a whole lot.
Some of you may be familiar with OSHA - the Occupational Safety & Health Administration. The sole purpose of this agency is to keep American workers safe. Complying
with OSHA regulations isn’t always easy, but if we work together, we can do it. Yet, complying with regulations is not the real reason for working safely. Our real motive is
simple. We care about each and every one of you and will do what is necessary to prevent you from being injured.
However, keeping our workplace safe takes input from everyone. Management, supervisor, and all of you have to come together on this issue, or we’re in trouble. For
example, upper management has to approve the purchase of safe equipment. Supervisors, including myself, have to ensure that each of you knows how to use that
equipment safely. Then it’s up to you to follow through the task and use the equipment as you were trained. If any one part of this chain fails, accidents are going to
happen and people are going to get hurt.
At the core of your safety responsibilities lies the task of recognizing safety and health hazards. In order to do that, you must first understand what constitutes a hazard.
Extreme hazards are often obvious. Our hopes are that you won’t find too many of those around here.
There are, however, more subtle hazards that won’t jump up and bite you. As a result of your safety training and meetings like these, some things may come to mind. For
example, a machine may not be easy to lock out. Common practice may be to use a tag. This is a potential hazard and should be discussed. Maybe something can be
changed to make it easier to use a lock. Other subtle hazards include such things as frayed electrical cords, a loose machine guard, a cluttered aisle, or maybe something
that just doesn’t look right.
A big part of recognizing hazards is using your instincts. Nobody knows your job as well as you do, so we’re counting on you to let us know about possible problems.
Beyond recognizing hazards, you have to correct them or report them to someone who can. This too, is a judgement call. For example, if something spills in your work
area you can probably clean it up yourself. However, if there is an unlabeled chemical container and you have no idea what it is, you should report it to your supervisor.
Good housekeeping is a major part of keeping your work area safe. For example, you should take a few minutes each day to ensure that aisles, hallways, and stairways in
your work area are not obstructed. If boxes, equipment, or anything else is left to pile up, you have a tripping hazard on your hands. Those obstructions could keep you
from exiting the building quickly and safely should you face an emergency situation.
Also watch out for spills. These can lead to slips and falls. Flammable materials are another thing to be aware of. Make sure they are disposed of properly.
Keep Thinking. Even if you’re doing your job safely and you are avoiding hazards, there are often even better ways to work safely. If you have ideas for improving the
safety of your job or that of co-workers, share them.
Concluding Remarks
While nothing we do can completely eliminate the threat of an incident, we can work together to improve our odds. As I said, this must be a real team effort and I’m
counting on input from all of you. Let’s keep communicating and continue to improve safety.
Available at: <http://www.ncsu.edu/ehs/www99/right/training/ meeting/emplores.html>. Retrieved on: April 1st, 2012. Adapted.
a) blame supervisors and managers who cannot use equipment safely in the office.
b) inform employees that the use of instincts is all it takes to prevent dangers at work.
c) present OSHA to American workers who had never heard about this organization.
d) argue that the acquisition of modern and safer equipment can prevent all job accidents.
e) encourage the cooperation of all employees so as to prevent dangers in the workplace.
www.tecconcursos.com.br/questoes/352342
The fragment “all of you have to come together on this issue, or we’re in trouble” is understood as a(n)
a) funny joke
b) call to action
c) violent threat
d) ineffective request
www.tecconcursos.com.br/questoes/352345
According to the text, employees have several safety responsibilities at work, EXCEPT
a) understanding what constitutes a hazard.
b) using their instincts to help prevent risks.
c) avoiding obstructed spaces in the work area.
d) eliminating the use of all flammable materials.
e) correcting dangers or reporting on them to have them solved.
www.tecconcursos.com.br/questoes/352349
a) believes that labor risks cannot be reduced by team efforts and commitment.
b) expects to be kept informed of potential situations that may be dangerous.
c) considers the cooperation of workers an irrelevant measure to improve safety at work.
d) defends that corporate management is accountable for all issues regarding safety at work.
e) feels that co-workers’ suggestions are useless in identifying hazards in the work environment.
www.tecconcursos.com.br/questoes/1345531
The discovery of a giant oil accumulation in ultradeep waters off Brazil’s southeast coast is opening a new frontier for exploration and production. This pre-salt play, in the
Santos basin, contains potentially recoverable reserves ranging from 795 million m³ to 1.3 billion m³ of oil equivalent. Just one of several structures found beneath a thick
layer of salt, the Tupi structure, is pushing technological boundaries as E&P teams seek to define its geographic limits.
Types of reserves
Pre-salt, postsalt and subsalt formations are all capable of forming traps and seals for migrating hydrocarbons. Pre-salt wells target reservoirs beneath the layer of
autochthonous salt. Subsalt wells target reservoirs beneath the mobile allochthonous salt canopy. Postsalt wells target formations above the salt.
Geology
From a geologic perspective, this play is a product of interminably slow tectonic and depositional processes involving continental rifting, seafloor spreading, and
sedimentation. These processes were associated with the split between South America and Africa during the Cretaceous breakup of Gondwana. The depositional
processes created source, reservoir, and seal layers necessary to successfully produce an active petroleum system.
Technology
From a technological perspective, the feasibility of the pre-salt play is a result of operator experience gained through overcoming the challenges of constructing wells in
deep and ultradeep waters off the coast of Brazil. Just as important are improvements to seismic imaging, which allow geophysicists to identify potential structures masked
beneath layered evaporites that may be as thick as 2,000 m [6,560 ft].
E&P challenges
Expertise and techniques developed to exploit deepwater fields of the Campos basin have been adapted to wells in the Santos basin. Exploration models from the Santos
basin pre-salt play, in turn, have led to significant discoveries in neighboring basins. This article discusses the geology and history of Brazil’s pre-salt play. It describes
challenges associated with exploration and production of presalt carbonate reservoirs and their impact on the advancement of new models.
d) studying the Tupi structure to assess the extent of a petroleum-bearing formation.
www.tecconcursos.com.br/questoes/1345533
In Text I, while comparing pre-salt, subsalt and postsalt, the author states that
a) the functional quality of the respective reserves is potentially equivalent.
www.tecconcursos.com.br/questoes/1345538
b) masking of potential structures and the identification of formations in deep and ultradeep waters.
d) construction of wells in deep waters and the feasibility of the pre-salt play.
e) operator’s well-construction experience and the enhancement in the image processing of seismic activity.
www.tecconcursos.com.br/questoes/1345555
The Consortium of block BM-C-32 in Brazil, operated by BP - British Petroleum (NYSE:BP,LON:BP) that owns 40%, Anadarko (NYSE:APC,FRA:AAZ), with 33.3% and Maersk
Oil (CPH:MARSK B,PINK:AMKBF) with the remaining 26.7%, announced today (November 9, 2011) the successful Itaipu-2 pre-salt appraisal well.
Located in block BM-C-32 in the Campos Basin offshore Brazil, the well was drilled to total depth of approximately 4,877 meters in 1,420 meters of water, and encountered
a gross petroleum column of approximately 18 meters in a pre-salt carbonate reservoir.
According to Anadarko’s press release of December 17, 2009, originally this consortium was operated by Devon Energy Corp. (NYSE:DVN) with 40% working interest,
Anadarko with 33.3%, and SK Energy Co., Ltd. (Private) holding the remaining 26.7% working interest.
“The pre-salt Itaipu-2 well is an aggressive stepout from the Itaipu discovery well, which is located 4 miles (7 kilometers) northwest,” Anadarko Sr. Vice President,
Worldwide Exploration, Bob Daniels said. “The Itaipu-2 well established a fluid contact and appears to have successfully extended the accumulation 120 meters downdip
from the discovery. Accordingly, the appraisal well significantly increases the areal extent of the vast Itaipu field, and we believe that, by incorporating the data from both
the appraisal well and the original discovery well, we will increase our previous resource estimates for the field. We are very pleased with these results and look forward to
continuing our activity on the block.”
In Text II, in relation to block BM-C-32, it was announced on November 9, 2011 that
www.tecconcursos.com.br/questoes/1345558
According to Text II, the pre-salt Itaipu-2 well is an aggressive step-out from the Itaipu Discovery well because it
a) has established a way for the fluid to step out of the well.
e) will encounter a column with a large quantity of gross petroleum together with Itaipu Discovery well.
www.tecconcursos.com.br/questoes/1345562
c) While Text I approaches the subject mostly theoretically, Text II provides an instance of a felicitous application of what is stated in Text I.
Gabarito
1) D 2) B 3) C 4) B 5) D 6) E 7) B
8) A 9) B 10) D 11) D 12) B 13) E 14) C
15) B 16) E 17) A 18) B 19) A 20) A 21) E
22) E 23) A 24) D 25) C 26) B 27) C 28) D
29) B 30) B 31) C 32) C 33) E 34) A 35) A
36) B 37) A 38) E 39) B 40) C 41) D 42) C
43) B 44) D 45) D 46) B 47) A 48) A 49) D
50) B 51) A 52) E 53) C 54) B 55) A 56) D
57) D 58) B 59) E 60) A 61) D 62) B 63) C
64) E 65) A 66) D 67) A 68) E 69) B 70) E
71) D 72) B 73) C 74) A 75) B 76) C 77) B
78) B 79) E 80) D 81) A 82) E 83) E 84) A
85) B 86) D 87) E 88) B 89) C 90) C 91) E
92) D 93) D 94) B 95) A 96) C 97) E 98) E
99) E 100) C 101) A 102) C 103) D 104) B 105) A
106) D 107) B 108) D 109) E 110) A 111) B 112) A
113) C 114) C 115) B 116) D 117) D 118) A 119) B
120) C 121) E 122) B 123) D 124) E 125) C 126) A
127) E 128) C 129) B 130) B 131) A 132) E 133) E
134) B 135) D 136) E 137) E 138) D 139) D 140) D
141) A 142) C 143) E 144) B 145) A 146) E 147) D
148) B 149) D 150) A 151) E 152) A 153) E 154) B
155) D 156) D 157) A 158) B 159) D 160) D 161) C
162) D 163) E 164) A 165) C 166) E 167) A 168) D
169) D 170) E 171) C 172) D 173) D 174) D 175) E
176) C 177) B 178) B 179) E 180) D 181) C 182) D
183) B 184) D 185) C 186) E 187) C 188) E 189) A
190) A 191) E 192) B 193) D 194) B 195) D 196) A
197) E 198) B 199) B 200) C