What To Expect During An HR Interview? Five Questions You’ll Be Asked During A Screening Interview – by Melissa Llarena

Screening interviews with human resources professionals are a crucial step to getting the job. A good or bad interview with HR will determine how far you go in the interviewing process, so it’s best to know what to expect and go in prepared.

As a career coach, I have worked with job candidates on how to answer the most common questions asked by HR. My mock interviews place clients in situations similar to ones they will actually face and prepare them to ace their interviews and land the job.

Let’s take a look at the five most common questions asked by HR during screening interviews and how you should approach them.

1. Why are you interested in this position?

HR professionals love this question, so use it as your chance to reiterate your strengths and highlight your applicable skill set and your passion for the company and the role. Speak to how your past experiences match the qualifications for the job, using keywords from the job description to make the connection stronger. By clearly linking your skills to the position, you are helping the HR manager envision you in the role.

Sample Answer: Having worked within the financial services sector for five years, I have gained an appreciation for the power of client-facing roles in terms of my professional development and organizational impact. As a relationship manager in your firm, I would bring my ability to resourcefully optimize any given client’s portfolio to bear on your company’s five-year strategic goal of retaining its client base at a 50% rate. I have done this in the past in X capacity while working for my previous employer and am confident that I can help you accelerate your current goals while growing my career.

2. Tell me about yourself.

This age-old prompt will likely never go away, so it’s important to know how to provide a compelling answer for an HR manager. Instead of the typical chronological walk through your background, I recommend doing a SWOT analysis within the context of a professional interview: analyze the sector, the company, and the job function, and look for opportunities to market yourself. I go into this in more detail in my blog post on how to tell your professional story in a way that will entice an interviewer to hire you.

Sample Answer: I have been a sales manager for X years, with experience that includes leading a sales force to hit aggressive goals. In light of your organization’s core strength in hiring the brightest salespeople, I would know exactly how to coach them to sell both new and legacy products in new markets quickly. While at Company X, I created the gold-standard incentive program that helped us sell in potential charge volume that exceeded our goals by 20% in both travel and daily expenses. Prior to that, I worked at X, where I completed X, etc. Side note: figure out the hiring firm’s assets and needs and tailor your response accordingly.

3. Why are you leaving your current job?

HR managers ask this question to determine if there are any red flags related to your departure. Are you leaving on good terms or bad? Are you looking to escape from your current job or grow within a new one? These are a few of the questions running through the interviewer’s mind. Take this opportunity to speak positively of your current employer but communicate that you’re looking at this new position as the next step in your career. By framing your answer positively, you’re making the interviewer focus on your potential contributions rather than any red flags.

Sample Answer: My business unit started with 50 full-time employees and today it has 10. While this reduction in personnel enabled me to showcase my ability to produce results with limited resources in an organization where management has turned over, I am interested in transitioning to an organization like yours where there is growth potential. For example, in my current role I managed to acquire 100K clients with only one other sales manager and a dwindling budget. In your company, I would be managing a team of 20 sales managers, where I stand to make a significant impact not only for your team but also on the firm’s market share.

4. What do you know about the company?

This is a test and one you should be able to pass easily. Doing research on a company prior to an interview is a necessity. You need to know the history and makeup of the company, who the key players are, recent accomplishments and mentions in the press, and any other relevant information. Communicate the positive information you learned about the company, from awards to new product launches, to demonstrate your knowledge.

Sample Answer: Your firm competes with firm A, firm B, and firm C in the U.S. My understanding is that you are better positioned in this area than firms A, B, or C, while firms B and C bring these strengths to the table. Given my skill set, I know that I can help you optimize your strength in this area and offset the strengths that firms B and C plan to invest in more heavily during 2014. Side note: the point is to be specific about how you’d use this information to drive results.

5. What questions do you have for me?

ALWAYS have questions for the interviewer. The strongest candidates show their enthusiasm and position themselves as potentially valuable team members by asking smart, strategic questions that benefit both the interviewer and the interviewee. If you’re stumped, here are five questions to ask HR that will take you to the next phase of the interviewing process.

To learn more about how to navigate job interviews, or if you have an upcoming interview, set up a 15-minute consultation. I have helped professionals go from second choice to first.

Melissa Llarena is a firsthand career transition expert (having gone through 16 business unit changes in 10 years) and president of Career Outcomes Matter. Sign up for her blog at www.careeroutcomesmatter.com.

How to separate learning myths from reality – by Artin Atabaki, Stacey Dietsch, and Julia M. Sperling

Over the years, you have probably gained some insight into how your brain works. You may have taken a course or read a book that promised to reveal the secret of maximizing your mental capacity—a common sales pitch of leadership coaches these days. In the process, you may have read that after a critical period in childhood there is no hope for significant learning, that half of your brain is inactive at any given time, or that you’re capable of learning properly only in your preferred style.

Each of these claims is what we call a “neuromyth,” a misconception based on incorrect interpretations of neuroscientific research. Our experience advising companies on their lifelong-learning initiatives suggests that such misunderstandings remain embedded in many corporate training programs. As companies increasingly pour money into developing their employees, they can no longer afford to invest in training programs based on inaccurate and out-of-date assumptions. In recent years, for example, US businesses alone spent more than $164 billion annually on employee learning.1 The stakes are high and getting higher.

Bridging the gap between popular neuromyths and the scientific insights gathered in the past few decades is a growing challenge. As modern brain-imaging techniques, such as functional magnetic resonance imaging (fMRI), have advanced scientific knowledge, these misleading lay interpretations by business practitioners have advanced as well. Unless such misconceptions are eliminated, they will continue to undermine both personal- and organizational-learning efforts. In this article, we’ll address the three most prominent neuromyths in light of the latest research and explore some of the implications for corporate learning.

Myth #1: The critical window of childhood

Most of us have heard about critical learning periods—the first years of life, when the vast majority of the brain’s development is thought to occur. After this period, or so the assumption too often goes, the trajectory of human development is deemed to be more or less fixed. That, however, is an exaggeration. Recent neuroscientific research indicates that experience can change both the brain’s physical structure and its functional organization—a phenomenon described as neuroplasticity.

Researchers studying the plasticity of the brain are increasingly interested in mindfulness. Practicing simple meditation techniques, such as concentrated breathing, helps build denser gray matter in parts of the brain associated with learning and memory, controlling emotions, and compassion. A team led by Harvard scientists has shown that just eight weeks of mindful meditation can produce structural brain changes significant enough to be picked up by MRI scanners.2

Organizations from General Mills in consumer foods to digital bellwethers such as Facebook and Google increasingly give their employees opportunities to benefit from mindfulness and meditation. Most such programs have garnered enthusiastic support from employees, who often see a marked improvement in their mind-sets and job performance. For example, employees at the health insurer Aetna who have participated in the company’s free yoga and meditation classes report, on average, a 28 percent decrease in their levels of stress and a productivity increase of 62 minutes a week—an added value of approximately $3,000 per employee a year. CEO Mark Bertolini, who started the program a few years ago, marvels at the level of interest generated across the company; to date, more than a quarter of Aetna’s 50,000 employees have taken at least one class.3 Leaders like Bertolini understand that providing employees with the tools to become more focused and mindful can foster a working environment conducive to development and high performance.
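
The arithmetic behind that last figure is easy to check with a rough back-of-envelope sketch. Only the 62 minutes a week and the roughly $3,000 a year come from the article; the number of working weeks and the implied hourly value of employee time below are illustrative assumptions, not numbers reported by Aetna.

```python
# Back-of-envelope check of the Aetna productivity figure cited above.
# Assumption (not from the article): roughly 48 working weeks per year.
minutes_per_week = 62                      # reported productivity gain
working_weeks_per_year = 48                # assumed

hours_regained_per_year = minutes_per_week / 60 * working_weeks_per_year
print(f"Hours regained per employee per year: {hours_regained_per_year:.0f}")  # ~50

reported_value = 3000                      # dollars per employee per year, as cited
implied_hourly_value = reported_value / hours_regained_per_year
print(f"Implied value of one employee hour: ${implied_hourly_value:.0f}")      # ~$60
```

Under that assumption, the quoted dollar figure amounts to valuing a recovered hour of employee time at roughly $60.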

Myth #2: The idle-brain theory

A recent European survey found that nearly 50 percent of teachers surveyed in the United Kingdom and the Netherlands believed that the idle-brain theory (the notion that large parts of the brain sit inactive at any given time) has been proved scientifically.4 This misunderstanding originally stemmed from inaccurate interpretations of activation hot spots in brain-imaging studies. By now, more carefully interpreted functional brain scans have shown that, irrespective of what a person is doing, the entire brain is generally active and that, depending on the task, some areas are more active than others. People can always learn new ideas and new skills, not by tapping into some unused part of the brain, but by forming new or stronger connections between nerve cells.

This insight into the brain’s capacity becomes particularly relevant for the environment and context in which learning typically occurs. Everybody knows, all too well, about the habit of quickly checking e-mails or planning for the next meeting in the middle of a training session. The problem is that such multitasking engages large parts of the brain’s working memory. Without freeing that up, we cannot successfully memorize and learn new information. In short, multitasking and learning cannot occur effectively at the same time.

Some organizations, recognizing this problem, are working to build immersive learning environments where distractions are eliminated. At McKinsey, we’ve created a model factory that participants can walk through to see operating conditions in action. But first, everyone is asked to place their phones and other distracting belongings in a locker so they can fully concentrate on the learning exercise at hand. At many companies, removing the temptation to use mobile devices during learning sessions is becoming commonplace.

Myth #3: Learning styles and the left/right brain hypothesis

Almost everyone has encountered the theory that most people are either dominantly analytical (and left-brained) or more creative (and right-brained). However, this either/or dichotomy is false. The two hemispheres of the brain are linked and communicate extensively; they do not work in isolation. This simplistic binary has led, in many businesses, to the misconception that each of us has a strictly preferred learning style and channel. Recent studies have flatly disproved this idea, suggesting instead that engaging all the senses in a variety of ways (for instance, audiovisual and tactile) can help employees retain new content.

One organization that puts this idea into practice is KFC, which uses multiple forms of learning in customer-service training. Sessions begin with an after-hours board game placing the entire team of a store in the role of the customer. This is followed up by “gamified” learning that fits into roughly 15-minute windows during shifts. These video game–like modules put the employees behind the cash register to handle a number of typical customer experiences, including responding to audio and visual cues of satisfaction. At the end of the online modules, employees physically reconvene at the front of the store to hear feedback, report on what they’ve learned, and receive live coaching as reinforcement.

Although significant progress has been made, much remains to be done to eradicate neuromyths from the philosophy of corporate-training programs. Neuroscience research has confirmed some of the approaches that learning professionals already use, such as on-the-job reinforcement and engagement without distractions. But that research has also contradicted other approaches. Companies should draw on the newly substantiated insights and may need to rethink their training programs accordingly. At the very least, they need to improve their dialogue with, and understanding of, the scientific community.

About the authors

Artin Atabaki is a consultant in McKinsey’s Stuttgart office; Stacey Dietsch is an associate principal in the Washington, DC, office; and Julia M. Sperling is a principal in the Dubai office.

The authors wish to thank McKinsey’s Jennifer May, Michael Rennie, and Kristina Wollschlaeger for their support of and contributions to this article.

The Water Cooler Runs Dry – by Frank Bruni

If you’re closing in on 50 but want to feel much, much older, teach a college course. I’m doing that now, at 49, and hardly a class goes by when I don’t make an allusion that prompts my students to stare at me as if I just dropped in from the Paleozoic era.

Last week I mentioned the movie “They Shoot Horses, Don’t They?” Only one of the 16 students had heard of it. I summarized its significance, riffling through the Depression, with which they were familiar, and Jane Fonda’s career, with which they weren’t. “Barbarella” went sailing over their heads. I didn’t dare test my luck with talk of leg warmers and Ted Turner.

I once brought up Vanessa Redgrave. Blank stares. Greta Garbo. Ditto. We were a few minutes into a discussion of an essay that repeatedly invoked Proust’s madeleine when I realized that almost none of the students understood what the madeleine signified or, for that matter, who this Proust fellow was.

And these are young women and men bright and diligent enough to have gained admission to Princeton University, which is where our disconnect is playing out.

The bulk of that disconnect, obviously, is generational. Seemingly all of my students know who Gwyneth Paltrow is. And with another decade or two of reading and living and being subjected to fossils like me, they’ll assemble a richer inventory of knowledge and trivia, not all of it present-day.

But the pronounced narrowness of the cultural terrain that they and I share — the precise limits of the overlap — suggests something additional at work. In a wired world with hundreds of television channels, countless byways in cyberspace and all sorts of technological advances that permit each of us to customize his or her diet of entertainment and information, are common points of reference dwindling? Has the personal niche supplanted the public square?

Both literally and figuratively, the so-called water-cooler show is fading fast, a reality underscored by a fact that I stumbled across in last week’s edition of The New Yorker: In the mid-1970s, when the sitcom “All in the Family” was America’s top-rated television series, more than 50 million people would tune in to a given episode. That was in a country of about 215 million.

I checked on the No. 1 series for the 2012-13 television season. It was “NCIS,” an episode of which typically drew fewer than 22 million people, even counting those who watched a recording of it within a week of its broadcast. That’s out of nearly 318 million Americans now.
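
Expressed as a share of the population, using the approximate figures quoted above, the shrinking overlap is easy to quantify; this is a quick illustrative calculation, not additional reporting.

```python
# Share of the U.S. population reached by each show, using the column's
# approximate figures (viewers and population in millions).
all_in_the_family_share = 50 / 215   # mid-1970s top-rated series
ncis_share = 22 / 318                # 2012-13 top-rated series

print(f"'All in the Family': about {all_in_the_family_share:.0%} of Americans")  # ~23%
print(f"'NCIS': about {ncis_share:.0%} of Americans")                            # ~7%
```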

“NCIS” competes against an unprecedented bounty of original programming and more ways to see new and old shows than ever, what with cable networks, subscription services, YouTube, Apple TV and Aereo. Yahoo just announced that it was jumping into the fray and, like Netflix and Amazon, would develop its own shows.

In movies, there’s a bevy of boutique fare that never even opens in theaters but that you can order on demand at home. In music, streaming services and Internet and satellite radio stations showcase a dizzying array of songs and performers, few of whom attain widespread recognition. In books, self-publishing has contributed to a marked rise in the number of titles, but it doesn’t take an especially large crowd of readers for a book to become a best seller. Everyone’s on a different page.

With so very much to choose from, a person can stick to one or two preferred micro-genres and subsist entirely on them, while other people gorge on a completely different set of ingredients. You like “Housewives”? Savor them in multiple cities and accents. Food porn? Stuff yourself silly. Vampire fiction? The vein never runs dry.

I brought up this Balkanization of experience with Hendrik Hartog, the director of the American studies program at Princeton, and he noted that what’s happening in popular culture mirrors what has transpired at many elite universities, where survey courses in literature and history have given way to meditations on more focused themes.

“There’s enormous weight given to specialized knowledge,” he said. “It leaves an absence of connective tissue for students.” Not for nothing, he observed, does his Princeton colleague Daniel Rodgers, an emeritus professor of history, call this the “age of fracture.”

It has enormous upsides, and may be for the best. No single, potentially alienating cultural dogma holds sway. A person can find an individual lens and language through which his or her world comes alive.

And because makers of commercial entertainment don’t have to chase an increasingly apocryphal mass audience, they can produce cultish gems, like “Girls” on HBO and “Louie” on FX.

But each fosters a separate dialect. Finding a collective vocabulary becomes harder. Although I’m tempted to tell my students that they make me feel like the 2,000-year-old man, I won’t. I might have to fill them in first on Mel Brooks.

How Virtual Humans Can Build Better Leaders – by Randall W. Hill, Jr.

The aviation industry has long relied on flight simulators to train pilots to handle challenging situations. These simulations are an effective way for pilots to learn from virtual experiences that would be costly, difficult, or dangerous to provide in the real world.

And yet in business, leaders commonly find themselves in tricky situations for which they haven’t trained. From conducting performance reviews to negotiating with peers, they need practice to help navigate the interpersonal dynamics that come into play in interactions where emotions run high and mistakes can result in lost deals, damaged relationships, or even harm to their — or their company’s — reputation.

Some companies, particularly those with substantial resources, do use live role-playing in management and other training. But such training is expensive, and it is limited by time and availability constraints and by a lack of consistency. Advances in artificial intelligence and computer graphics are now enabling the equivalent of flight simulators for social skills – simulators that have the potential to overcome these problems. These simulations can provide realistic previews of what leaders might encounter on the job, engaging role-play interactions, and constructive performance feedback for one-on-one conversations or complex dynamics involving multiple groups or departments.

Over the past fifteen years, our U.S. Army-funded research institute has been advancing both the art and science behind virtual human role players (computer-generated characters that look and act like real people) and social simulations (computer models of individual and group behavior). Thousands of service men and women are now getting virtual reality and video game-based instruction and practice in how to counsel fellow soldiers, how to conduct cross-cultural negotiations, and even how to anticipate how decisions will be received by different groups across, and outside of, an organization. Other efforts provide virtual human role players to help train law students in interviewing child witnesses, budding clinicians in improving their diagnostic skills and bedside manner, and young adults on the autism spectrum in answering questions in a job interview.

Our research is exploring how to build resilience by taking people through stressful virtual situations, like the loss of a comrade, child, or leader, before they face them in reality. We are also developing virtual humans that can detect a person’s non-verbal behaviors and react and respond accordingly. Automated content-creation tools allow for customized scenarios, and new software and off-the-shelf hardware are making it possible to create virtual humans modeled on any particular person. It could be you, your boss, or a competitor.

Imagine facing a virtual version of the person you have to lay off. Might you treat him or her differently than a generic character? What if months of preparation for an international meeting went awry just because you declined a cup of tea? Wouldn’t you wish you’d practiced for that? If a virtual audience programmed to react based on your speaking style falls asleep during your speech, I’d be surprised if you didn’t pep up your presentation before facing a real crowd.

It is still early days in our virtual-human development work, but the results are promising. An evaluation of ELITE (the emergent leader immersive training environment), the performance-review training system we developed for junior and noncommissioned officers, found that students showed better retention and application of knowledge, more confidence in using the skills, and greater awareness of the importance of interpersonal communication skills for leadership.

A related study showed that subjects found the virtual human interaction as engaging and compelling as the same interaction with a live human role-player. I can say from personal experience that asking questions of the students in a virtual classroom can be exhilarating (and unnerving when the virtual student acts just like a “real” student, slouching in boredom and mumbling an answer). Unlike a live human actor, however, a virtual human does not need to be paid, can work anytime, and can be consistent with all students, or take a varied approach if needed. Virtual human systems can have the added advantage of built-in assessment tools to track and evaluate a performance.

Technology alone is not the answer, of course. As I recently wrote in “Virtual Reality and Leadership Development,” a chapter of the book Using Experience to Develop Leadership Talent, virtual humans and video game-based systems are only as effective as the people who program them. No matter how convincing a virtual human is, it’s just an interface. If the instructional design behind it is flawed, it won’t be effective. So we focus as intensively on what a virtual human is designed to teach, how learning will occur, and how to continuously improve its performance as on the technology itself.

I believe simulation technologies are going to change the way we educate and train the workforce, particularly in the area of social skills. In time, just as a pilot shouldn’t fly without practicing in a simulator first, managers and leaders will routinely practice with virtual humans for the challenging situations they’re sure to encounter.

Randall W. Hill, Jr., is the executive director of the University of Southern California Institute for Creative Technologies and an expert in how virtual reality and video games can be used to develop effective learning experiences. He is also a research professor of computer science at USC.

This column will change your life: interestingness v truth – by Oliver Burkeman

‘Even in the world of academia, most people aren’t motivated by the truth. What they want, above all, is not to be bored’

Do you long to become a “thought leader”, thinkfluencing your way from TED talk to tech conference, lauded for your insights? I hope not. But if so, you could do worse than consult a paper published in 1971 by the maverick sociologist Murray Davis, entitled “That’s Interesting!” (I found it via Adam Grant.) What is it, Davis asks, that makes certain thinkers – Marx, Freud, Nietzsche – legendary? “It has long been thought that a theorist is considered great because his theories are true,” he writes, “but this is false. A theorist is considered great, not because his theories are true, but because they are interesting.” Even in the world of academia, most people aren’t motivated by the truth. What they want, above all, is not to be bored.

Forty-three years on, this feels truer than ever. We live in the Era of Interestingness: attention is money, and purveyors of the interesting can make millions from Twitter feeds of amazing facts – even if they’re not always true facts – or from books or blogs offering intriguingly counterintuitive perspectives. (This column’s part of the problem, except I’ve yet to make millions.) Moreover, Davis argues, there are only a handful of main ways for an idea to be interesting. To grab people’s attention, you should argue that something we think of as bad is good, or vice versa; that some apparently individual phenomenon is really collective; that several seemingly disparate things are actually part of the same thing; and a few others. It’s unnerving how many thinkers can be pigeonholed this way. Christian morality seems good, Nietzsche argued, but really it’s bad. Mental disorders, dreams and slips of the tongue might seem unrelated, Freud said, but really they’re the result of the same inner drives. And on and on.

Clearly, this could be helpful information if you’re looking to intrigue friends, fascinate a potential lover, or keep your students engaged. But it’s also troubling. If you care about the truth, Davis suggests, interestingness can mislead. That new book on how to get fit – or raise happy children, or invest your savings – caught your eye because it’s interesting. But is it true? (In science, this helps explain the “file drawer effect”: studies with interesting conclusions get published; boring ones, however true, get locked away.) Ultimately, interestingness is a form of excitement, and we all know how excitement can lure us off course: consider the thrill of an extramarital affair, or of driving at 120mph. But it’s intellectually respectable excitement, so it doesn’t ring alarm bells.

Perhaps it should. When he gives talks, the spiritual author Eckhart Tolle likes to warn the audience that they may not find the experience interesting. He’s not simply lowering expectations. He means that constantly to chase after what’s interesting is to miss something crucial about life. Interestingness gives the mind something to chew on – but the best experiences come when you stop chewing. When you’re watching a stunning sunset, Tolle asks, “could you say, ‘This sunset is interesting’? Only if you were trying to write a PhD about sunsets… Truly look, and then what you’re looking at goes beyond interesting… There’s nothing interesting about it, and yet it’s awe-inspiring.”

My first reaction to that was, “How interesting! I must explore this topic further!” which just shows how addictive interestingness can be. The correct reaction, obviously, is to go and watch the sunset.

oliver.burkeman@theguardian.com
The Guardian, Saturday 5 April 2014