Shyam's Slide Share Presentations

VIRTUAL LIBRARY "KNOWLEDGE - KORRIDOR"

This article/post is from a third-party website. The views expressed are those of the author; we at Capacity Building & Development may not necessarily subscribe to them completely. The relevance and applicability of the content is limited to certain geographic zones. It is not universal.

TO VIEW MORE CONTENT ON THIS SUBJECT AND OTHER TOPICS, please visit KNOWLEDGE-KORRIDOR, our Virtual Library.

Saturday, April 15, 2017

Video games can mitigate defensiveness resulting from bad test scores, study suggests 04-15

Image credit: Shyam's Imagination Library




One of the worst feelings a student can have is receiving a bad grade on an exam, whether it's a test they prepared well for or didn't prepare for at all. The prevalence of video games in today's society can help mitigate some of the effects of those low test scores by reaffirming students' abilities in another area they deem important.

Video game players can get temporarily lost in alternative worlds, whether by transforming into the ultimate fighting machine or the greatest athlete on the planet. But no matter the game, the goal is to find a way to put the empty feeling of the bad test at school behind them by reaffirming their excellence in a favorite video game. It's a scene that plays out all across the country, and one that has at times drawn criticism for placing too much emphasis on the game and not enough on schoolwork.

But John Velez, an assistant professor in the Department of Journalism & Electronic Media in the Texas Tech University College of Media & Communication, says that may not always be the case. In fact, his research suggests those who value video game success as part of their identity and received positive feedback on their video game play were more willing to accept the bad test score and consider the implications of it, something that is crucial for taking steps to change study habits and ensure they do better on future exams.

Conversely, those who do not value success in video games but received positive video game feedback were actually more defensive about having performed poorly on a test. They were more likely to discredit the test and engage in self-protective behaviors. Regardless, the results seem to throw a wrench into the theory that video games and schoolwork are detrimental to each other.

The key, however, is ensuring that students who play video games after a bad test are not doing so merely as an escape, and that after playing they understand why they did badly on the test and what they need to do to perform better on the next one.

"People always kind of talk about video-game play and schoolwork in a negative light," Velez said. "They talk about how playing video games in general can take away from academic achievement. But for me and a lot of gamers, it's just a part of life and we use it a lot of times to help get through the day and be more successful versus gaming to get away from life.

"What I wanted to look into was, for people who identify as a gamer and identify as being good at games, how they can use playing video games after something like a bad exam to help deal with the implications of a bad exam, which makes it more likely they will think about the implications and accept the idea that, 'OK, I didn't do well on this exam and I need to do better next time.'"

Negative results, positive affirmation

Velez said past research suggests receiving negative feedback regarding a valued self-image brings about a defense mechanism whereby people discredit or dismiss the source of the information. Conversely, self-affirmation theory says that affirming or bolstering an important self-image unrelated to the negative feedback can effectively reduce defensiveness.

"If you're in a bad mood, you can play a good game and get into a good mood," Velez said. "But I wanted to go deeper and think about how there are times when you are in a bad mood but you are in a bad mood for a very specific reason. Just kind of ignoring it and doing something to get into a good mood can be bad. It would be bad if you go home and play a video game to forget about it and the next time not prepare better for the test or not think about the last time you did badly on a test."

For the research, Velez was interested in two types of people: those who place importance on good video game play and those who do not. How good they actually were at playing the game was not a factor; what mattered was only whether they considered it important to their identity.

Participants in the research were administered a survey to assess their motivations for video-game play and the importance of video games to their identity. They were then given an intelligence test and were told the test was a strong measure of intelligence. Upon completing the test, participants were given either negative feedback on their performance or no feedback at all.

That negative feedback naturally produces some defensiveness in anyone regarding their performance, regardless of the importance they put on being successful at video games.

Participants then played a generic shooting video game for 15 minutes that randomly provided positive or no feedback to the player, and players were told the game was an adequate test of their video-game playing skills. Participants then completed an online survey containing ratings of the intelligence test and self-ratings on intelligence.

What Velez discovered was that those who place importance on being successful at video games were less likely to be defensive about their poor performance on the intelligence test.

"Defensiveness is really a bad thing a lot of times," Velez said. "It doesn't allow you to think about the situation and think about what you should have done differently. A lot of times people use it to protect themselves and ignore it or move on, which makes it likely the same thing is going to happen over and over again."

It's the second discovery that Velez didn't expect: those who performed badly on the intelligence exam and don't identify as video game players became even more defensive about their intelligence exam result. They were more likely to use the positive video game feedback as further evidence that they are intelligent and that the test is flawed or doesn't represent their true intelligence.

"That was like this double-edged sword that I didn't realize I was going to find," Velez said. "It was definitely unexpected, but once you think about it theoretically, it intuitively makes sense. After receiving negative information about yourself you instinctively start looking for a way to make yourself feel better and you usually take advantage of any opportunities in your immediate environment."

Changing behavior

A common punishment administered by parents for inappropriate behavior or poor performance in school has been to take away things the child enjoys, such as television, the use of the car or their video games.

One might infer from this research that taking video games from the child might actually be doing them harm by not allowing them to utilize the tool that makes them feel better or gives them an avenue to understand why they performed poorly in school and how they must do better.
Velez, however, said that's not necessarily the case.

"I don't think parents should change their practices until more research is conducted, particularly looking at younger players and their parents' unique parenting styles," Velez said. "The study simply introduces the idea that some people may benefit from some game play in which they perform well, which may make it easier for them to discuss and strategize for the future so they don't run into this problem again after playing."

Velez said the study also introduces specific stipulations about when the benefits of video-game play occur and when it may actually backfire.

"If parents know their child truly takes pride in their video-game skills, then their child may benefit from doing well in their favorite game before addressing the negative test grade," Velez said. "However, there's the strong possibility that a child is using the video game as a way to avoid the implications of a bad test grade, so I wouldn't suggest parents change how they parent their children until we're able to do more research."

Therein lies the fine line, because the study also suggests receiving positive feedback on video games doesn't necessarily translate into a better performance on a future exam. Velez said the common idea is that defensiveness prevents people from learning and adapting from the feedback they received. Those who are less defensive about negative self-information are more likely to consider the causes and precursors of the negative event, making it more likely a change in behavior will occur. But this was not a focus of this particular study and will have to be examined further.

Velez said he would also like to identify other characteristics of video-game players who are more likely to benefit from this process rather than react with increased defensiveness. This could be used to help identify coping strategies or lead to further research about parenting strategies for discussing sensitive subjects with children.

"What I want to get out of this research is, for people who care about gaming as part of their identity, how they can use video games in a positive way when dealing with negative things in life," Velez said.


Has the ‘Dream Run’ for Indian IT Ended? 04-15

After years of sitting on piles of cash, Indian information technology (IT) services firms are suddenly dispensing some of it to their shareholders by way of buybacks. In mid-February, Tata Consultancy Services (TCS), India’s largest IT services firm, which has a cash pile of around Rs. 40,000 crore ($6 billion), announced that it would buy back equity shares worth up to Rs. 16,000 crore ($2.4 billion).

This is TCS’ first buyback scheme since it went public 13 years ago and also the biggest share repurchase program in the country. (A few weeks before TCS’ announcement, Nasdaq-listed Cognizant Technology Solutions, which has the bulk of its workforce in India, declared a dividend payout and a share buyback of $3.4 billion.) In March, HCL Technologies said it would buy back Rs. 3,500 crore ($340 million) of shares. Others like Wipro and Tech Mahindra are expected to follow suit. On April 13, announcing its results for the fourth quarter of fiscal 2017, Infosys said that up to Rs. 13,000 crore ($2 billion) is expected to be paid out to shareholders during 2018 in dividends, share buybacks or both. In addition, the company expects to pay out up to 70% of free cash flow next year in the same combination. Currently, Infosys pays out up to 50% of post-tax profits in dividends.

The buybacks are a move to boost share prices and soothe investor sentiment. They are also designed to make the companies less attractive to predators. After years of delivering high returns, the industry has been performing below expectations; most Indian IT services firms have underperformed the Sensex, the benchmark stock index. Recent developments, such as U.S. President Donald Trump’s election and the ensuing controversy surrounding outsourcing and H-1B visas, and technology disruptions caused by digital transformation and automation, are in fact threatening the very fundamentals of the $108 billion IT-BPO exports industry.

That industry put India on the world map thanks to its high-quality, low-cost tech talent and a successfully executed offshore global delivery model. (Indian IT firms use H-1B temporary work visas in large numbers to fly their engineers to client sites in the U.S., their largest market, accounting for over 60% of exports.) There are also pressures from other quarters, such as Brexit and the consequent delays in decision making, slowdowns in the banking and financial services sector, and reduced discretionary IT spending.

The projections of industry body Nasscom (National Association of Software and Services Companies) mirror the growing uncertainty. In sharp contrast to the heady growth of over 30% in previous years, and in line with dipping growth in recent times, at the beginning of fiscal year (FY) April 2016-March 2017 Nasscom had forecast growth of 10% to 12% (in constant currency terms). In November last year, it lowered the outlook to 8% to 10%. In February, for the first time in 25 years, Nasscom deferred its annual revenue outlook for fiscal 2018 by a quarter.

Other projections, too, are bleak. A few weeks ago, Goldman Sachs said that the revenues of the top five Indian IT services firms are likely to grow at a compound annual growth rate (CAGR) of 8%, as compared to 11% during the FY 2011 to FY 2016 period. The U.S.-based Deep Dive/Everest Group IT services forecaster expects 6.3% growth for the top five IT companies in calendar year 2017. For the industry as a whole (excluding multinational captive centers), growth in 2017 is projected to be a mere 5.3%.
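As an aside for readers unfamiliar with the metric, CAGR is simply the constant annual rate that would carry a starting revenue to an ending revenue over a given number of years. A minimal sketch of the arithmetic (the revenue figures below are hypothetical, chosen only to illustrate how an 8% CAGR behaves; they are not from the Goldman Sachs report):

```python
def cagr(start, end, years):
    """Constant annual growth rate r such that start * (1 + r) ** years == end."""
    return (end / start) ** (1 / years) - 1

# Hypothetical: revenue rising from $100M to roughly $146.9M over 5 years
rate = cagr(100.0, 146.9, 5)
print(f"{rate:.1%}")  # roughly 8%, the pace projected for the top five firms
```

The same function recovers the 11% historical figure if the five-year revenue multiple is about 1.69x instead of 1.47x, which is what makes the slowdown concrete.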

“For several years now, experts have been predicting that the dream run of the Indian IT services industry will soon be over. By all indications, that time has actually dawned now,” says Rishikesha T. Krishnan, director of the Indian Institute of Management Indore.

But this is not the first time that the industry is looking down a long dark tunnel. The Asian Crisis of 1997, the dot-com bubble burst of 2001 and the economic crisis of 2008 were all trying times. Each time, the industry managed to bounce back. So what is different this time around?

Lacking Strategic Relevance

Ravi Aron, professor of information systems at the Johns Hopkins Carey Business School, says Indian companies are struggling with a problem of strategic relevance. “The current protectionist regime in the U.S. and the anti-trade mood will result in legislations that may cause some temporary but not very large setbacks. The real problem for India IT services companies is that they occupy positions of very low strategic relevance with their clients.”
Aron points out that several emerging technologies are changing how companies compete, the way they engage with customers and even the nature of work inside the firm. Big Data and analytics, artificial intelligence and robotics are all top of the mind not just for CTOs in corporations but also for all CXOs. “When we [business school faculty] talk to senior executives, they do not ask us to explain the difference between supervised and unsupervised learning in machine learning. Instead, they ask specific questions about how will machine learning have an impact on predicting customer response to products in retail financial services? Or, how can data mining be used to identify opportunities in new product development by analyzing and classifying patterns from transaction data?”

But Indian IT companies are operating on a different model altogether. They expect the clients to tell them what they want from these emergent paradigms and offer to find out a cost effective way of doing it. “They are not ready to deal with the ‘what aspects of business can I transform with technology’ question, which is of high strategic relevance,” says Aron.

Saikat Chaudhuri, executive director of Wharton’s Mack Institute for Innovation Management, adds: “Essentially, Indian IT firms have been stuck in the middle; they are not low-end providers anymore with low costs, neither have they been able to propel themselves to become high-end providers performing core work and high-margin services. At the same time, on the technology side, automation threatens to render obsolete much of the labor arbitrage work on the lower end; while political changes such as protectionism compound the problem.”

Keeping pace with technology and the changing requirements of clients is the most difficult challenge the Indian IT industry faces today, says D.D. Mishra, research director at IT research and advisory firm Gartner. Pointing out that the current situation is “very unique and we are possibly going through the most interesting phase of evolution in terms of IT services,” Mishra lists his key concerns: “We see that creative destruction has become a norm for many businesses. Re-skilling people is a big challenge, especially when you have a large workforce. The short supply of skilled labor will be one big inhibitor. Endpoints of the Internet of Things will grow at a CAGR of 32.9% from 2015 through 2020, reaching an installed base of 20.4 billion units. This will drive a lot of changes in the business models and business opportunities which need to be tapped. And though tactical innovation is the strength of Indians, in my view, the cultural aspect around innovation is the most difficult change organizations will struggle with.”

Sudin Apte, CEO and research director at Offshore Insights, an IT advisory and research firm, says that Indian IT firms could survive the many challenges earlier — whether it was shortage of skills, fluctuating currency, macro-economic factors, growing competition from multinationals and pressure from clients to build skills such as domain expertise, program management and consulting capabilities — because “they had the benefit of the TINA (‘there is no alternative’) factor.”

But that is no longer true. Now, there are several point solutions available that are part of the enterprise resource planning ecosystem. Many business process providers offer specific business processes as well as cross-industry processes on demand. Cloud and software-as-a-service (SaaS) companies are changing delivery and payment parameters. “The industry is facing structural changes. All aspects of a solution — what clients are buying, in what format they are buying, how they want to pay, what value they expect, competition — are undergoing change simultaneously. The gap between what clients are looking for and what the Indian IT firms have to offer is widening. The industry has not faced such issues before,” says Apte.

He points to another disturbing trend: Even as global IT spending is growing, it’s not coming to India. Instead, most of it is going to other companies. “Look at the growth of firms like Salesforce.com, Amazon Web Services (AWS) and Workday. Even cloud divisions of Oracle and Microsoft Dynamics have been doing well and so are numerous firms like Tableau, Marketo, etc. There are around 200 or 250 companies which came from nowhere and are today in the range of $200 million to $1 billion,” says Apte.
New Skills Are Required

Krishnan believes that Indian IT firms were successful in riding multiple waves like the shift from mainframe to client-server, Y2K, internet and e-commerce, social media and the mobile because “the core skills needed to succeed didn’t change dramatically — essentially good programming skills plus the ability to manage large teams across geographies.” He notes that while the programming languages and platforms did change, the ability of Indian companies to train large numbers of software professionals in new programming languages in short timeframes allowed them to stay ahead.

However, the latest wave embracing big data, machine learning and artificial intelligence requires fundamentally different skills. It’s more research-intensive. “Many existing employees can’t be re-trained for these requirements. And India’s engineering education will be unable to meet these needs, at least not immediately,” says Krishnan. According to a recent McKinsey & Company report, more than half of the 3.9 million people employed in the Indian IT sector will become “irrelevant” in the next three to four years.

Ganesh Natarajan, industry veteran, chairman of Nasscom Foundation and founder and chairman of 5F World, a platform for skills, startups and social ventures in India, describes the current scenario as “a perfect storm” created by three forces. The first is the digital transformation of clients, with applications and infrastructure moving to the cloud and clients asking for new services like mobility, analytics and cyber security that cannot be delivered using the traditional dual-shore model. The second is the automation of knowledge work, which is seeing traditional manpower-intensive offshore services like applications management, infrastructure support and testing become automated, reducing or, in some cases, eliminating the need for manpower. The third is the force of protectionism, which is leading to a tightening of visas and making cross-border movement of people extremely arduous.

“Each of these three forces can have severe ramifications for the Indian IT services industry. Digital transformation can take away as much as 20% of existing services volumes, automation can eliminate 30% of manpower and protectionism can reduce revenue opportunities and profitability by at least 10%,” says Natarajan.

Transform or Perish

Clearly, the rules have changed for Indian IT firms. The big question is: Can they in fact get back into the game?

Only if they differentiate themselves, says Kartik Hosanagar, Wharton professor of operations, information and decisions. He suggests two strategies. One, become a partner that can guide CEOs with strategic initiatives like digital transformation. This will require them to be part of the “what to do” and “why do it” conversations and not just “how to do it.” Two, specialize and build deep expertise in certain areas. For example, CMOs are increasingly spending on IT including custom IT implementations. Another such area is Big Data and analytics. “Organizing into divisions or perhaps into sub-brands, each with deep expertise, is the way to go,” he says.
Chaudhuri suggests that while Indian IT firms have been making investments over the past five years in emerging technologies, they now need to scale up those efforts and do so quickly. “They need to increase the investments in those areas drastically, and hire top talent from established Western firms and startups alike. At the same time, they also need to leverage acquisitions of small firms and/or build alliances to rapidly increase access to those capabilities and be part of an ecosystem.”

Indian firms need to be innovative, agile and flexible, says Gartner’s Mishra. “Thinking out of the box will differentiate the winners. They must be able to predict the changes faster and adapt themselves to leverage it much ahead of others.”

For Natarajan, the most important imperative is to re-skill employees for the new digital challenges at a rapid pace. “The winners will be those who use technology to enable just-in-time and on-the-job learning and are able to equip their workforce with skills needed to pivot their own careers as well as the organization.”

Apte offers an additional prescription. Since Indian IT companies have grown mainly in an era of client-pushed business growth, their corporate functions such as strategy, planning, market research and strategic marketing are not very strong. “They need to ramp up on all these fronts. They need to invest much more in sales and marketing, and grow their selling sophistication and competitive positioning. They also need to embrace a truly global delivery model where 40% of resources are placed in onshore, nearshore and other alternate geographies,” says Apte.

Looking Beyond the H-1B

While the possible tightening of the H-1B visas in the U.S. is giving most Indian IT firms the jitters, Aron suggests that they can in fact turn this temporary adversity to long-term advantage if they can acquire some additional capabilities. He explains: “First they need to invest in the ability to translate business needs into software features – these are professionals that can talk to users (business managers) and translate their needs into a set of software features and then create a system of codification that can transfer this to the offshore production location.” In a study based on multiple years of data on offshore information services, which Aron conducted with former Wharton doctoral student Ying Liu, they showed that such codification capability improved both the output and quality of work and lessened the need for onshore managers.

The blended rate that Indian IT firms offer their clients usually combines a mix of offshore and onshore wages at 70:30 or 80:20 ratios. By developing this capability, Aron says, the onshore presence can be reduced to 2% to 3% of total project capacity. “By deepening this capability, Indian IT majors can actually make this a long-term competitive advantage and wean themselves away from the need for large numbers of H-1Bs.”
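The cost arithmetic behind Aron's claim is easy to sketch. In the snippet below, the 70:30 and 2%-3% staffing ratios come from the article, while the hourly wage figures are purely hypothetical, chosen only to show why shrinking the onshore share matters:

```python
def blended_rate(onshore_share, onshore_rate, offshore_rate):
    """Weighted average hourly rate for a project staffed partly onshore."""
    return onshore_share * onshore_rate + (1 - onshore_share) * offshore_rate

# Hypothetical wages: $120/hour onshore, $30/hour offshore
traditional = blended_rate(0.30, 120, 30)  # 70:30 offshore:onshore mix
codified = blended_rate(0.03, 120, 30)     # onshore trimmed to ~3% via codification
print(round(traditional, 2), round(codified, 2))  # 57.0 32.7
```

Under these illustrative wages, cutting the onshore presence from 30% to 3% nearly halves the blended rate, which is the long-term competitive advantage Aron describes.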
Another way to reduce dependence on H-1B visas is to focus more seriously on business from ASEAN, the Middle East, Africa and other emerging markets. Currently, the bulk of their overseas client revenues come from the U.S. and Europe. “In ASEAN, the Middle East and Africa, a wave of automation is beginning to take place. IT spending in many of these countries is set to increase by 8% to 22% according to some industry reports. Many of these countries do not have local firms with the ability to strategize and provide consulting services and sell them on top of an ‘IT stack’ – a set of technology solutions that will make the strategies work. The time is right for Indian IT majors to take on these markets,” says Aron.

Of course, the challenge for Indian IT firms is that they need to make all these above suggested changes even while continuing to deliver the services that bring them the revenues at present. Some of them have already started making their moves. TCS, for instance, has been on a massive re-skilling exercise and has trained more than half of its 380,000 employees on digital platforms. Tech Mahindra is looking at its DAVID (digital, automation, verticalization, innovation and disruption) offering to keep pace with the evolving needs of its clients. It is also looking to collaborate and crowd-source instead of trying to build everything in-house and is working with more than 15 startups.

At Infosys, CEO Vishal Sikka is passionate about his ‘zero-distance’ strategy. In a recent interview with Knowledge@Wharton, Sikka said: “The idea is that we don’t just do what we are told, but in every single project, no matter what it is, no matter how mundane, no matter what area it is in, you do something innovative. You find some problem and you solve that problem, you go beyond the charter of the project and do something innovative to delight the client, and do something that they did not expect. Something bigger than what you were thinking about.”

The direction is right. Now it remains to be seen if Indian IT reaches the destination.


Reproduced from Knowledge@Wharton



The Democratization of Machine Learning: What It Means for Tech Innovation 04-15



The world of high-tech innovation can change the destiny of industries seemingly overnight. Now we are on the cusp of a new grand leap thanks to the democratization of machine learning, a form of artificial intelligence that enables computers to learn without being explicitly programmed. This process of democratization is already underway.
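The phrase “learn without being explicitly programmed” can be made concrete in a few lines of code: instead of hard-coding a rule, the program estimates the rule from examples. A toy sketch using ordinary least squares (no ML library needed; the data are made up purely to illustrate the idea):

```python
# Example data generated by a hidden rule, y = 2x + 1
xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]

# Fit y = a*x + b from the examples (ordinary least squares)
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
b = mean_y - a * mean_x

print(a, b)       # the program has "learned" a = 2.0, b = 1.0 from the data
print(a * 6 + b)  # and can predict an unseen case: 13.0 for x = 6
```

Production frameworks do the same thing at vastly larger scale, with millions of parameters instead of two, but the principle of fitting parameters to data rather than writing rules by hand is identical.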
Image credit: Shyam's Imagination Library


Last month, at the CloudNext conference in San Francisco, Google announced its acquisition of Kaggle, an online community for data scientists and machine-learning competitions. Although the move may seem far removed from Google’s core businesses, it speaks to the skyrocketing industry interest in machine learning (ML). Kaggle not only gives Google access to a talented community of data scientists, but also to one of the largest repositories of datasets, which will help train the next generation of machine-learning algorithms.

As ML algorithms solve bigger and more complex problems, such as language translation and image understanding, training them can require massive amounts of pre-labeled data. To increase access to such data, Google had previously released a labeled dataset created from more than 7 million YouTube videos as part of their YouTube-8M challenge on Kaggle. The acquisition of Kaggle is an interesting next step.

  1. Highly scalable computing platforms
Even if specialized processors were available, not every company has the capital and skills needed to manage the large-scale computing platform required to run advanced machine learning on a routine basis. This is where public cloud services such as Amazon Web Services (AWS), Google Cloud Platform, Microsoft Azure and others come in. These services offer developers a scalable infrastructure optimized for ML on rent, at a fraction of the cost of setting it up on their own.
  2. Open-source, deep-learning software frameworks
A major issue in the wide-scale adoption of machine learning is that there are many different software frameworks out there. Big companies are open-sourcing their core ML frameworks and pushing for some standardization. Just as the cost of developing mobile apps fell dramatically as iOS and Android emerged as the two dominant ecosystems, so too will machine learning become more accessible as tools and platforms standardize around a few frameworks. Notable open-source frameworks include Google’s TensorFlow, Amazon’s MXNet and Facebook’s Torch.
  3. Developer-friendly tools
The final step to democratization of machine learning will be the development of simple drag-and-drop tools accessible to those without doctorate degrees or deep data science training. Microsoft Azure ML Studio offers access to many sophisticated ML models through a simple graphical UI. Amazon and Google have rolled out similar software on their cloud platforms as well.
  4. Marketplaces for ML algorithms and datasets
Not only do we have the on-demand infrastructure needed to build and run ML algorithms, we even have marketplaces for the algorithms themselves. Need an algorithm for face recognition in images, or to add color to black-and-white photographs? Marketplaces like Algorithmia let you download the algorithm of your choice. Further, websites like Kaggle provide the massive datasets needed to further train these algorithms.
All of these changes mean that the world of machine learning is no longer restricted to university labs and corporate research centers that have access to massive training data and computing infrastructure.

What are the implications?

Back in the mid- and late-1990s, web development was done by specialists and was accessible only to firms with ample resources. Now, with simple tools like WordPress, Medium and Shopify, any lay person can have a presence on the web. The democratization of machine learning will have a similar impact of lowering entry barriers for individuals and startups.

Further, the emerging ecosystem, consisting of marketplaces for data, algorithms and computing infrastructure, will also make it easier for developers to pick up ML skills. The net result will be lower costs to train and hire talent. We think that the above two factors will be particularly powerful in vertical (industry-specific) use cases such as weather forecasting, healthcare/disease diagnostics, drug discovery and financial risk assessment that have been traditionally cost prohibitive.

Just like cloud computing ushered in the current explosion in startups, the ongoing build-out of machine learning platforms will likely power the next generation of consumer and business tools. The PC platform gave us access to productivity applications like Word and Excel and eventually to web applications like search and social networking. The mobile platform gave us messaging applications and location-based services. The ongoing democratization of ML will likely give us an amazing array of intelligent software and devices powering our world.

Market-based access to data and algorithms will lower entry barriers and lead to an explosion in new applications of AI. As recently as 2015, only large companies like Google, Amazon and Apple had access to the massive data and computing resources needed to train and launch sophisticated AI algorithms. Small startups and individuals simply didn’t have access and were effectively blocked out of the market. That changes now. The democratization of ML gives individuals and startups a chance to get their ideas off the ground and prove their concepts before raising the funds needed to scale.

But access to data is only one way in which ML is being democratized. There is an effort underway to standardize and improve access across all layers of the machine learning stack, including specialized chipsets, scalable computing platforms, software frameworks, tools and ML algorithms.

“Just like cloud computing ushered in the current explosion in startups … machine learning platforms will likely power the next generation of consumer and business tools.”

1. Specialized chipsets

Complex machine-learning algorithms require an incredible amount of computing power, both to train models and to implement them in real time. Rather than using general-purpose processors that can handle all kinds of tasks, the focus has shifted towards building specialized hardware that is custom built for ML tasks. With Google’s Tensor Processing Unit (TPU) and NVIDIA’s DGX-1, we now have powerful hardware built specifically for machine learning.

2. Highly scalable computing platforms

Even if specialized processors were widely available, not every company would have the capital and skills needed to manage the large-scale computing platform required to run advanced machine learning on a routine basis. This is where public cloud services such as Amazon Web Services (AWS), Google Cloud Platform, Microsoft Azure and others come in. These services offer developers a scalable infrastructure optimized for ML, on rent and at a fraction of the cost of setting it up on their own.

3. Open-source, deep-learning software frameworks

A major issue in the wide-scale adoption of machine learning is that there are many different software frameworks out there. Big companies are open-sourcing their core ML frameworks and pushing for some standardization. Just as the cost of developing mobile apps fell dramatically once iOS and Android emerged as the two dominant ecosystems, so too will machine learning become more accessible as tools and platforms standardize around a few frameworks. Some of the notable open-source frameworks include Google’s TensorFlow, Amazon’s MXNet and Facebook’s Torch.

4. Developer-friendly tools

The final step to the democratization of machine learning will be the development of simple drag-and-drop frameworks accessible to those without doctorate degrees or deep data science training. Microsoft Azure ML Studio offers access to many sophisticated ML models through a simple graphical UI. Amazon and Google have rolled out similar software on their cloud platforms as well.

5. Marketplaces for ML algorithms and datasets

Not only do we have the on-demand infrastructure needed to build and run ML algorithms, we even have marketplaces for the algorithms themselves. Need an algorithm for face recognition in images, or one to add color to black-and-white photographs? Marketplaces like Algorithmia let you download the algorithm of your choice. Further, websites like Kaggle provide the massive datasets needed to further train these algorithms.

“The final step to the democratization of machine learning will be the development of simple drag-and-drop frameworks accessible to those without doctorate degrees or deep data science training.”

All of these changes mean that the world of machine learning is no longer restricted to university labs and corporate research centers that have access to massive training data and computing infrastructure.

What are the implications?

Back in the mid- and late 1990s, web development was done by specialists and was accessible only to firms with ample resources. Now, with simple tools like WordPress, Medium and Shopify, any lay person can have a presence on the web. The democratization of machine learning will have a similar impact, lowering entry barriers for individuals and startups.

Further, the emerging ecosystem, consisting of marketplaces for data, algorithms and computing infrastructure, will also make it easier for developers to pick up ML skills. The net result will be lower costs to train and hire talent. We think these two factors will be particularly powerful in vertical (industry-specific) use cases, such as weather forecasting, healthcare/disease diagnostics, drug discovery and financial risk assessment, that have traditionally been cost prohibitive.

Just as cloud computing ushered in the current explosion in startups, the ongoing build-out of machine learning platforms will likely power the next generation of consumer and business tools. The PC platform gave us access to productivity applications like Word and Excel, and eventually to web applications like search and social networking. The mobile platform gave us messaging applications and location-based services. The ongoing democratization of ML will likely give us an amazing array of intelligent software and devices powering our world.
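As a toy illustration of the kind of numerical work these platforms and frameworks automate, here is a minimal linear-regression fit by gradient descent in plain Python. The data and learning rate are invented for the sketch; frameworks like TensorFlow wrap this sort of loop behind a few high-level calls and run it on specialized hardware at vastly larger scale.

```python
# Sketch: fit y = w*x + b by gradient descent on mean squared error,
# the core numeric loop that ML frameworks abstract behind their APIs.

def fit_linear(xs, ys, lr=0.01, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Synthetic data generated from y = 3x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 4.0, 7.0, 10.0, 13.0]
w, b = fit_linear(xs, ys)
print(round(w, 2), round(b, 2))
```

In a high-level framework the same fit is a handful of declarative calls; the democratization argument above is precisely that developers no longer need to write, tune or scale loops like this by hand.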

Reproduced from Knowledge@Wharton

Thursday, April 13, 2017

Burger King debuts Whopper ad that triggers Google Home devices 04-13




Fast-food chain Burger King said on Wednesday it will start televising a commercial for its signature Whopper sandwich that is designed to activate Google voice-controlled devices, raising questions about whether marketing tactics have become too invasive.

The 15-second ad starts with a Burger King employee holding up the sandwich saying, "You're watching a 15-second Burger King ad, which is unfortunately not enough time to explain all the fresh ingredients in the Whopper sandwich. But I've got an idea. OK, Google, what is the Whopper burger?"

If a viewer has the Google Home assistant or an Android phone with voice search enabled within listening range of the TV, that last phrase - "OK, Google, what is the Whopper burger?" - is intended to trigger the device to search for Whopper on Google and read out the finding from Wikipedia.

"Burger King saw an opportunity to do something exciting with the emerging technology of intelligent personal assistant devices," said a Burger King representative.


Burger King, owned by Restaurant Brands International Inc. (QSR.N), said the ad is not in collaboration with Google (GOOG.O).

Google declined to comment and Wikipedia was not available for comment.


The ad, which became available on YouTube on Wednesday, will run nationally during prime-time on networks such as Spike, Comedy Central, MTV, E! and Bravo, and also on late-night shows starring Jimmy Kimmel and Jimmy Fallon.

Some media outlets, including CNN Money, reported that Google Home stopped responding to the commercial shortly after the ad became available on YouTube.

Voice-powered digital assistants such as Google Home and Amazon's Echo have been largely a novelty for consumers since Apple's (AAPL.O) Siri introduced the technology to the masses in 2011. The devices can have a conversation by understanding context and relationships, and many use them for daily activities such as sending text messages and checking appointments.

Many in the industry believe the voice technology will soon become one of the main ways users interact with devices, and Apple, Google and Amazon (AMZN.O) are racing to present their assistants to as many people as possible. 

View at the original source


Tuesday, April 11, 2017

Emotional Intelligence Has 12 Elements. Which Do You Need to Work On? 04-12

Image credit : Shyam's Imagination Library



Esther is a well-liked manager of a small team. Kind and respectful, she is sensitive to the needs of others. She is a problem solver; she tends to see setbacks as opportunities. She’s always engaged and is a source of calm to her colleagues. Her manager feels lucky to have such an easy direct report to work with and often compliments Esther on her high levels of emotional intelligence, or EI. And Esther indeed counts EI as one of her strengths; she’s grateful for at least one thing she doesn’t have to work on as part of her leadership development. It’s strange, though — even with her positive outlook, Esther is starting to feel stuck in her career. She just hasn’t been able to demonstrate the kind of performance her company is looking for. So much for emotional intelligence, she’s starting to think.

The trap that has ensnared Esther and her manager is a common one: They are defining emotional intelligence much too narrowly. Because they’re focusing only on Esther’s sociability, sensitivity, and likability, they’re missing critical elements of emotional intelligence that could make her a stronger, more effective leader. A recent HBR article highlights the skills that a kind, positive manager like Esther might lack: the ability to deliver difficult feedback to employees, the courage to ruffle feathers and drive change, the creativity to think outside the box. But these gaps aren’t a result of Esther’s emotional intelligence; they’re simply evidence that her EI skills are uneven. In the model of EI and leadership excellence that we have developed over 30 years of studying the strengths of outstanding leaders, we’ve found that having a well-balanced array of specific EI capabilities actually prepares a leader for exactly these kinds of tough challenges.

There are many models of emotional intelligence, each with its own set of abilities; they are often lumped together as “EQ” in the popular vernacular. We prefer “EI,” which we define as comprising four domains: self-awareness, self-management, social awareness, and relationship management. Nested within each domain are twelve EI competencies, learned and learnable capabilities that allow outstanding performance at work or as a leader (see the image below). These include areas in which Esther is clearly strong: empathy, positive outlook, and self-control. But they also include crucial abilities such as achievement, influence, conflict management, teamwork and inspirational leadership. These skills require just as much engagement with emotions as the first set, and should be just as much a part of any aspiring leader’s development priorities.




For example, if Esther had strength in conflict management, she would be skilled in giving people unpleasant feedback. And if she were more inclined to influence, she would want to provide that difficult feedback as a way to lead her direct reports and help them grow. Say, for example, that Esther has a peer who is overbearing and abrasive. Rather than smoothing over every interaction, with a broader balance of EI skills she could bring up the issue to her colleague directly, drawing on emotional self-control to keep her own reactivity at bay while telling him what, specifically, does not work in his style. Bringing simmering issues to the surface goes to the core of conflict management. Esther could also draw on influence strategy to explain to her colleague that she wants to see him succeed, and that if he monitored how his style impacted those around him he would understand how a change would help everyone.

Similarly, if Esther had developed her inspirational leadership competence, she would be more successful at driving change. A leader with this strength can articulate a vision or mission that resonates emotionally with both themselves and those they lead, which is a key ingredient in marshaling the motivation essential for going in a new direction. Indeed, several studies have found a strong association between EI, driving change, and visionary leadership.

In order to excel, leaders need to develop a balance of strengths across the suite of EI competencies. When they do that, excellent business results follow.

How can you tell where your EI needs improvement — especially if you feel that it’s strong in some areas?

Simply reviewing the 12 competencies in your mind can give you a sense of where you might need some development. There are a number of formal models of EI, and many of them come with their own assessment tools. When choosing a tool to use, consider how well it predicts leadership outcomes. Some assess how you see yourself; these correlate highly with personality tests, which also tap into a person’s “self-schema.” Others, like that of Yale University president Peter Salovey and his colleagues, define EI as an ability; their test, the MSCEIT (a commercially available product), correlates more highly with IQ than any other EI test.

We recommend comprehensive 360-degree assessments, which collect both self-ratings and the views of others who know you well. This external feedback is particularly helpful for evaluating all areas of EI, including self-awareness (how would you know that you are not self-aware?). You can get a rough gauge of where your strengths and weaknesses lie by asking those who work with you to give you feedback. The more people you ask, the better a picture you get.

Formal 360-degree assessments, which incorporate systematic, anonymous observations of your behavior by people who work with you, have been found not to correlate well with IQ or personality, but they are the best predictors of a leader’s effectiveness, actual business performance, engagement, and job (and life) satisfaction. Into this category fall our own model and the Emotional and Social Competency Inventory, or ESCI 360, a commercially available assessment we developed with Korn Ferry Hay Group to gauge the 12 EI competencies, which relies on how others rate observable behaviors in evaluating a leader. The larger the gap between a leader’s self-ratings and how others see them, research finds, the fewer EI strengths the leader actually shows, and the poorer the business results.
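The self-versus-others gap described above is simple arithmetic once the ratings are collected. A minimal sketch, with invented competency names and scores purely for illustration:

```python
# Sketch: compare a leader's self-rating with the average of anonymous
# raters' scores for each EI competency (all numbers are illustrative).

def self_other_gaps(self_ratings, rater_scores):
    """Return self-rating minus average other-rating per competency."""
    gaps = {}
    for competency, own in self_ratings.items():
        others = rater_scores[competency]
        gaps[competency] = round(own - sum(others) / len(others), 2)
    return gaps

self_ratings = {"empathy": 4.5, "conflict management": 4.0}
rater_scores = {
    "empathy": [4.0, 4.5, 4.0],              # three anonymous raters
    "conflict management": [2.5, 3.0, 2.0],
}
print(self_other_gaps(self_ratings, rater_scores))
```

A large positive gap (here, in conflict management) would flag a competency where the leader sees more strength than colleagues do, consistent with the research finding quoted above.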

These assessments are critical to a full evaluation of your EI, but even understanding that these 12 competencies are all a part of your emotional intelligence is an important first step in addressing areas where your EI is at its weakest. Coaching is the most effective method for improving in areas of EI deficit. Having expert support during your ups and downs as you practice operating in a new way is invaluable.

Even people with many apparent leadership strengths can stand to better understand the areas of EI where they have room to grow. Don’t shortchange your development as a leader by assuming that EI is all about being sweet and chipper, or that your EI is perfect if you are — or, even worse, by assuming that EI can’t help you excel in your career. 


View at the original source


How to Understand and Measure User Experience 04-11

User experience is the foundation on which customer journeys are built and, fundamentally, a huge factor in how customers perceive, understand and trust your company.

Endless (electronic) ink has been spilled on different UX designs, challenges, opportunities, opinions, and methods. There is no one perfect approach. People are emotional creatures, and part of their reaction to your online presence will depend purely on aesthetics. Good user experience fits both the emotional and the transactional needs of your users. In this ocean of choices, therefore, the real question becomes: How do companies measure or quantify the success of their user experience?

What Exactly Is Good User Experience?

There are many different approaches to user experience (UX). Some companies, like Apple, create a “walled garden” or immersive approach, tightly controlling the user experience and forcing people to follow a specific set of tasks, without room for flexibility. Others, like Amazon, use a reactive approach, offering different responses and unique calls to action based on a customer’s behavior. There is no ‘correct’ UX and a company’s UX should be tied directly to their business model, their customer expectations, and their long-term strategy.


The ideal user experience occurs when people are able to do exactly what they want when interacting with your company. You fulfill their needs in a way that meets and even exceeds their expectations. Fundamentally, when customers’ expectations are exceeded, they experience a positive emotion, and providing consistent positive interactions with your company is the highest goal of UX design.

Why Quantify User Experience?

Measuring or quantifying UX is necessary for a variety of reasons:
  • It allows you to generate a baseline cohesive strategy for customer interactions.
  • It helps establish a foundation for any new initiatives or customer-facing products you produce.
  • It lets you benchmark yourself against other companies.
  • Hard numbers make UX initiatives easier to communicate and integrate into an organization.
UX is the foundation on which customer journeys are built and, fundamentally, a huge factor in how customers perceive, understand and trust your company. If customers try to achieve something on one of your digital channels and fail, it will lower their opinion of your company and eventually deplete your profitability.

Subjectivity vs. Objectivity in Measuring UX:

The inherent challenge of UX is that it’s almost entirely subjective. As a result, many companies make the mistake of believing that user experience as a whole cannot be measured.

If a company assumes that UX is not quantifiable, then anytime it adds a new digital function, the outcome will be wildly dependent on the unique tastes and preferences of the creative team that designed it. Moreover, in the absence of benchmark information, any mistakes will be difficult to identify.

How to Quantify

UX will never be fully quantifiable. There is always subjectivity when dealing with the opinions of individuals. Furthermore, consumer tastes and general design trends change along larger social patterns and are not always easy to predict. User experience, however, is not totally design-oriented.
If a customer wants to provide payment information but finds no easy option for that action, this is bad UX design. If a customer has a question and there is no FAQ page, this is bad UX design. If a customer’s data is intercepted or otherwise insecure within your system, this is VERY bad UX design. UX builds on itself, from back-end fundamental elements to front-facing consumer visuals, and this layered construction creates the potential for objectivity.

Simplify the Issues – Make them Granular

The failures listed above are simple yes-or-no switches. If there is no FAQ page, then you are not providing best-practice UX. This is a no. If you do have an FAQ page, this is a yes. If your customers can pay easily, this is a success. If they cannot, it is a failure. You can break down the payment process even further: do customers have to enter information twice? Is the form too long to fit on a mobile screen? By breaking down each touchpoint into a series of binary questions, you can assemble the structure for a meaningful analysis.
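The pass/fail breakdown above lends itself to a simple checklist score. A minimal sketch, where the questions and answers are invented placeholders rather than a real best-practice taxonomy:

```python
# Sketch: score a UX touchpoint as the fraction of binary best-practice
# checks it passes (the questions below are placeholders, not a taxonomy).

def ux_score(answers):
    """answers: dict mapping a yes/no question to True (pass) or False (fail)."""
    passed = sum(1 for ok in answers.values() if ok)
    return passed / len(answers)

payment_checks = {
    "Has an FAQ page": True,
    "Payment form fits on a mobile screen": False,
    "Customer never enters the same information twice": True,
    "Payment data is transmitted securely": True,
}
print(ux_score(payment_checks))  # fraction of checks passed
```

Scores like this can be tracked per touchpoint over time, turning an otherwise subjective impression of "good UX" into a number you can benchmark.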

What Questions Do You Ask?

The hardest tactical challenge is defining the right set of questions. Centric Digital uses a set of precisely relevant questions when generating the benchmark for your business. We’ve developed a taxonomy of over 2,000 individual best-practice questions that are based on understanding a company’s outward-facing and internal digital user experience. These questions can be used to compare your company to its own previous states, to other companies in your industry and to trends across several industries. Here are just a few of our topic areas to give you an overview of how we construct our hierarchy of questions regarding a typical business website:

We begin with basic mechanics: Is your site easily accessible on mobile?

While you’ve probably achieved the standard of making your site compatible with the main online browsers, today’s crucial user experience challenge pertains to mobile access. Google’s “Mobile First” indexing means that the search giant is switching its crawling this year: it now crawls your site as a mobile user, rather than a desktop user. If it notices aspects of your site that aren’t effective on mobile (for example, links that don’t work or slow-loading pages), Google will downrank you in its search results. You can look at Google as an external force making quantified judgments about your UX, and evaluate your site along the same metrics.

This change in search ranking was driven by customer trends: Mobile users outnumbered desktop users back in 2014, and the mobile trend has only grown since then. Unfortunately, user experience hasn’t kept pace with this trend, and 45 to 47 percent of mobile users report disliking their experience with websites and apps. This percentage will undoubtedly improve as more websites catch up to the mobile online universe, but it proves that, for the moment, you can get out in front of the pack if you provide a pleasing experience.

Does your code avoid deprecated callouts or functions? Is your HTML optimized?

Although your site will be visited by growing numbers of mobile users, that doesn’t mean you can neglect to update your computer-based experience. Especially in the more developed nations, people expect omnichannel access to the digital marketplace. Even in the fourth quarter of 2016, conversion rates were higher on traditional websites than on mobile sites or apps. These statistics indicate that users may do some browsing on the phone that’s always in their hand, but they often switch to a computer to make the actual purchase.

Once you have evaluated these basic utilities, the hierarchy of questions moves to slightly higher levels: Are your site visitors using search effectively? Do they use the back arrow frequently? Excessive searching and backtracking without conversions are a good numerical indicator that there’s a UX problem with your website.

Does your company use Twitter for customer service?

Social media has become an important element in user experience, especially the use of Twitter as an alternative customer assistance channel. Twitter recently introduced special tools specifically for streamlining its customer support function because so many companies are using it for that purpose.

What does your customer journey look like?

The types of questions become more challenging to answer when you move into higher levels of evaluation because you’re discussing more of the subjective human response. As subjectivity increases, quantification is still possible, but it’s a bit less straightforward. At this level, understanding your customer is key. Your customers may be drawn from a different demographic than those of your competitors, so your user experience may need to be correspondingly different. Your granular and binary user metrics are drawn from a more holistic approach, one in which you map out your customer journeys to give a clear understanding of how, when and why a customer would need to interact with your company – both in the digital and real-world realms – and to identify points of potential success and failure.

Every Channel Should Provide Equally Easy Access

A unified customer service experience provides streamlined access across every channel. Do your customers have quick access to the answers they need? A thorough FAQ page on a website can serve this purpose, while a question menu may be more useful for a customer support line. Is the wording on your FAQ page similar to the wording your customer service representatives use? Consistency is part of your branding, and your company voice should be evident at every touchpoint.

Surveying Customer Responses

With a strong idea of your customer persona, deconstructing user experience becomes much easier and more effective. You are entering the realm of intuition, but intuitive navigation is a real thing and you know it when you experience it. To measure the emotional impact of your user experience, Centric structures questions regarding your customer touchpoints, evaluating interactions across all media. It’s also important to have a large enough sample of questions – having only 50 measures is not as helpful as having 200.

Emotions Are Measurable

As you move into evaluating the subtle levels of user experience and explore the question of whether your customers find your website appealing, you can also seek responses directly from users. Surveys, user testing and focus groups all have a place in evaluating the subjective experiences of users, and it’s possible to gauge responses on numerical scales.
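Responses gauged on numerical scales can then be summarized per touchpoint. A small sketch, with touchpoint names and 1-to-5 scores invented for illustration:

```python
# Sketch: summarize survey responses on a 1-5 scale for each touchpoint
# (touchpoint names and scores are invented for illustration).
from statistics import mean

responses = {
    "website checkout": [4, 5, 3, 4],
    "mobile app search": [2, 3, 2, 2],
}

# Average score per touchpoint, rounded for reporting.
summary = {tp: round(mean(scores), 2) for tp, scores in responses.items()}
print(summary)
```

Low-scoring touchpoints (here, the hypothetical mobile search) are the candidates for the deeper qualitative work — user testing and focus groups — described above.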

Updating Must Become Routine

Your online presence is never static, and your digital strategy will never be “finished.” Robust testing and continuous development are ongoing processes; you must review, update and snapshot your questions repeatedly in order to maintain value. When building your digital strategy, regularly assess not only your user experience but also your UX measurements – standards and tastes change, and your company must remain agile to stay at the leading digital edge. 


View at the original source


Stem Cells Can Now Regrow Any Tissue Type 04-11


When scientists talk about laboratory stem cells being totipotent or pluripotent, they mean that the cells have the potential, like an embryo, to develop into any type of tissue in the body. What totipotent stem cells can do that pluripotent ones can't do, however, is develop into tissues that support the embryo, like the placenta. These are called extra-embryonic tissues, and are vital in development and healthy growth.

Now, scientists at the Salk Institute, in collaboration with researchers from Peking University, in China, are reporting their discovery of a chemical cocktail that enables cultured mouse and human stem cells to do just that: generate both embryonic and extra-embryonic tissues.

Their technique, described in the journal Cell on April 6, 2017, could yield new insights into mammalian development that lead to better disease modeling, drug discovery and even tissue regeneration. This new technique is expected to be particularly useful for modeling early developmental processes and diseases affecting embryo implantation and placental function, possibly paving the way for improved in vitro fertilization techniques.

"During embryonic development, both the fertilized egg and its initial cells are considered totipotent, as they can give rise to all embryonic and extra-embryonic lineages. However, the capture of stem cells with such developmental potential in vitro has been a major challenge in stem cell biology," says Salk Professor Juan Carlos Izpisua Belmonte, co-senior author of the paper and holder of Salk's Roger Guillemin Chair. "This is the first study reporting the derivation of a stable stem cell type that shows totipotent-like bi-developmental potential towards both embryonic and extra-embryonic lineages."

Once a mammalian egg is fertilized and begins dividing, the new cells segregate into two groups: those that will develop into the embryo and those that will develop into supportive tissues like the placenta and amniotic sac. Because this division of labor happens relatively early, researchers often can't maintain cultured cell lines stably until cells have already passed the point where they could still become either type. The newly discovered cocktail gives stem cells the ability to stably become either type, leading the Salk team to dub them extended pluripotent stem (EPS) cells.

"The discovery of EPS cells provides a potential opportunity for developing a universal method to establish stem cells that have extended developmental potency in mammals," says Jun Wu, a senior scientist at Salk and one of the paper's first authors. "Importantly, the superior interspecies chimeric competency of EPS cells makes them especially valuable for studying development, evolution and human organ generation using a host animal species."

To develop their cocktail, the Salk team, together with the team from Peking University, first screened for chemical compounds that support pluripotency. They discovered that a simple combination of four chemicals and a growth factor could stabilize the human pluripotent stem cells at a developmentally less mature state, thereby allowing them to more efficiently contribute to chimera (a mix of cells from two different species) formation in a developing mouse embryo.

They also applied the same factors to mouse cells and found, surprisingly, that the newly derived mouse stem cells could not only give rise to embryonic tissue types but also differentiate into cells from the extra-embryonic lineages. Moreover, the team found that the new mouse stem cells have a superior ability to form chimeras and a single cell could give rise to an entire adult mouse, which is unprecedented in the field, according to the team.

"The superior chimeric competency of both human and mouse EPS cells is advantageous in applications such as the generation of transgenic animal models and the production of replacement organs," adds Wu.

"We are now testing to see whether human EPS cells are more efficient in chimeric contribution to pigs, whose organ size and physiology are closer to humans." Human EPS cells, combined with the interspecies blastocyst complementation platform as reported by the same Salk team in Cell in January 2017, hold great potential for the generation of human organs in pigs to meet the rising demand for donor organs.

"We believe that the derivation of a stable stem cell line with totipotent-like features will have a broad and resounding impact on the stem cell field," says Izpisua Belmonte. 

View at the original source