What do you know about Ipsos?
You really have to be able to put yourself in the shoes of the public or the shoes of any interested, you know, stakeholder or interested party who may be answering this question. JULIA CLARK [continued]: You know, if you're asking questions of lawyers, let's say, because you're doing a survey of lawyers, the terminology and the words you can use are going to be a little different, because there's a base sort of terminology or a sort of linguistic knowledge that they have about their industry.
So you can incorporate some of that language. You would never use that same language in a survey to the general public, where you have to make sure that everybody-- lower education, higher education, younger, older, engaged, less engaged, informed, less informed-- all of those people are able to equally answer the question that you're asking in a way that's valid and legitimate.
Is there anything else we're missing in terms of this angle? Is there something else we should be thinking about for the ICC? SPEAKER 1: Well, something you brought up, which is that a lot of the reason that they've gotten a lot of traction from this is the sort of feminist press and seeing it as a major step for women.
This case also was decided by an entirely female panel of judges, which would be really, really interesting to talk about the fact that-- is it the fact that there were three female judges? Does that make a difference? Would it be different if it was a mixed panel? So this has a really strong sort of female or feminine or feminist component to the whole situation.
So in terms of next steps, I think this is exciting, and I think actually, per our other agenda around doing some interesting polls that Ipsos publishes on our own, this could be a really useful angle. But we'll be designing some new stuff, too. JULIA CLARK [continued]: So if you can pull anything that we've done in the past, we'll put it all in one big document, and I'll work on the questions we've just discussed and maybe we'll get them together and then sit down with our client and see what he thinks about all this.
But I think he's going to like it. You absolutely need somebody who has done it so much that they can sort of make a sign-off and make the determination, and in most cases now, that's me, but collaborative design is always optimal. This is for a few reasons. First of all, not everybody is an expert on everything.
JULIA CLARK [continued]: So when I am designing questionnaires about millennials' usage of social media, I need some of my millennial colleagues to participate in that survey-design process, because they're going to be closer to some of these issues than I am in some ways.
But it's not just that. It's not just about subject-matter expertise. It's about ensuring that the questions you're designing work for everybody who will be answering them. And if we're talking about the general public, certainly, that means younger and older and male and female and educated and lower-educated and higher-income and lower-income and urban and rural and all the dimensions we have to consider when we're talking about America as a whole. If I sat there in isolation designing questions by myself, I would miss a lot of those perspectives. And our team is great.
We have different strengths and weaknesses. So some of us bring a real political acumen to the game, and some of us really focus much more on the social sciences. And I have a psychological background or a psych background, and so I'm able to really ensure that the language that we're using is the most direct, the most simple, the least nuanced, the least biased.
We don't want words that are open to interpretation in a lot of different ways. We want questions that are clear and clean. One of us will take the first pass, send it to the group. A few others take a pass. Then we have a meeting. Then the client provides some input. Then we all take another pass. And each time we're making tweaks. You know, it's rarely a wholesale redesign, although that occasionally happens.
But you need everyone looking at these questions multiple times to really get them in the best shape. Hey, Mike. How is it going? I was hoping I could pick your brain later on. I wanted to get your input on some of the questionnaire sort of structuring issues we're having with the Reuters survey.
Would that be all right? I know you have done that before for some of the other stuff we've been working on, so it would be great to pick your brains.
We're going to just meet in the conference room at noon. Thank you. Bad questionnaire design is really the bane of my professional existence.
It is immensely frustrating. It is something we see, unfortunately, a lot, though not so much in the professional realm of your top research companies. There's a lot of control that goes into how those questions are designed. And of course, in the age of the internet, everybody's questions are available. So there's a lot of external scrutiny. Some of the places you'll see biased polls or leading questions are-- often a media company or another type of organization will put just a little survey question on their website, and any of their readership can click in and answer it.
That's what we call a convenience sample-- there's no sort of validity or robustness or representativeness to it. And often that's where you see a question that says, sort of, how much do you like or dislike Donald Trump, right? But they wouldn't frame it that way. They'd say, how much do you dislike Donald Trump, right? And that leads us down a path of assuming that the dislike is there. If you say, how much do you dislike X, the premise is dislike.
And so you're saying, I dislike them a lot, or I dislike them a little. But of course, what if you like the person, right? That's not controlled for within the context of the question. So biased question design is extremely problematic. And it can be obvious, like the example I just gave-- how much do you dislike X?
But it can also be very insidious and much more subtle. And it speaks to the nature of the setup of the question itself. We have to think about the context. If you ask a whole series of questions about-- let's use the ACA, or Obamacare as it's known-- have you been impacted by Obamacare? Do you think you're paying too much for health care? Or do you think it's become worse? These are all leading questions. You know, that's a leading series of questions.
So not only is each question a bit biased, but you've led the respondent to a conclusion, because you've given some sort of negative, perhaps, information about the outcome or an attitude. And that's leading and biased as well. Now, even worse is if you then only publish the responses to that one "do you approve or disapprove of Obama" question without indicating that the whole series of questions you asked in the lead-up was also leading.
This is why Ipsos is a strong adherent to the Transparency Initiative. It's really about making sure the way you design the questionnaire, the way you design the entirety of the survey, as well as all the questions up to and including the published questions, are made public, are made available to anybody who wishes to see them, so they can assess for themselves question bias. What we're doing here now is, I've got some of the team together, and what we're going to be doing is brainstorming some technical solutions to a problem we're having with one of our questionnaires.
It's really lengthy. And we're going to be brainstorming some data-driven ideas to reformat it and rework it in a way that makes it much more usable. So I've got some of the team here who bring some of that knowledge to the table, and we're just going to talk through those issues.
We just wanted to do a quick brainstorm about this issue we're having with the Reuters questionnaire. It's this huge, long Word document. It's cumbersome. It's difficult to search.
It's difficult to match it up with dates and data. So we really need, I think, a good technical solution, and I think Katie and Mike, the two of you, have not only experience with the database side of things but with the programming and sampling side, too. So I think the four of us together can definitely figure something out. So, Eliza, can you just explain to us a little bit the current format, so we can start thinking through some solutions? So, currently, the questionnaire is in a Word format.
Due to the length of time that we've had this survey in field and due to programming restrictions, every question that we have ever asked is currently in one Word document, meaning that we are dealing with a questionnaire that, while it's only 15 minutes in length for the people taking it right now, actually runs to a very large number of pages. SPEAKER 1 [continued]: [LAUGHTER] So what we need to do is find a way to make this questionnaire easily searchable, to have a way to determine when questions were most recently live, because obviously sometimes we rotate certain values questions in and out over time, and of course, to do all of this while also satisfying the programming restrictions that we have to work within.
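The kind of solution the team is brainstorming here - moving each question out of one long Word document and into a structured, dated record - can be illustrated with a small example. This is a minimal sketch under stated assumptions, not Ipsos's actual tooling: the database, table, column names, and example questions below are all hypothetical, chosen only to show how "searchable, with dates for when each question was live" might look in practice.

```python
# Minimal sketch (hypothetical names throughout): store each questionnaire item
# as a structured record with its fielding dates, so the archive becomes
# searchable instead of living in one long Word document.

import sqlite3

conn = sqlite3.connect("questionnaire_archive.db")
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS questions (
        question_id   TEXT PRIMARY KEY,  -- label used in the questionnaire
        wording       TEXT NOT NULL,     -- full question text
        first_fielded TEXT NOT NULL,     -- ISO date the question first went live
        last_fielded  TEXT               -- NULL means still in rotation
    )
    """
)

# Illustrative rows only; not real survey items.
conn.executemany(
    "INSERT OR REPLACE INTO questions VALUES (?, ?, ?, ?)",
    [
        ("Q12", "Do you approve or disapprove of ...?", "2015-03-01", None),
        ("Q47", "How closely are you following ...?", "2016-01-15", "2016-11-30"),
    ],
)

def questions_live_on(date_iso: str):
    """Return the questions that were in the field on a given date."""
    cur = conn.execute(
        """
        SELECT question_id, wording FROM questions
        WHERE first_fielded <= ?
          AND (last_fielded IS NULL OR last_fielded >= ?)
        """,
        (date_iso, date_iso),
    )
    return cur.fetchall()

def search_wording(term: str):
    """Simple substring search over question wording."""
    cur = conn.execute(
        "SELECT question_id, wording FROM questions WHERE wording LIKE ?",
        (f"%{term}%",),
    )
    return cur.fetchall()

print(questions_live_on("2016-06-01"))
print(search_wording("approve"))
```

Keeping a first-fielded and last-fielded date on every record is what makes it possible both to search wording and to reconstruct which items were in rotation at any point in time; the programming restrictions the team mentions would still need to be handled separately.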
And this is one of the biggest issues facing our industry right now, actually. It's a very substantive issue. It's very complicated. A lot of us are grappling with this in sort of day-to-day real-life research problems right now. With a phone survey, there is an actual person sitting at a phone bank on the phone sort of calling-- you know, calling around saying, what do you think about X, Y, and Z? Whereas an online survey is in front of you on a screen or on a smartphone or tablet, and you're answering it yourself. And that has immense implications from a design perspective.
It seems maybe simple at the outset, but it's very fundamental, not only because of what we call interviewer effects or demand characteristics, which is that people may answer a little differently if they're talking to a person, maybe because they want to seem a little smarter or a little more interested, or they want to give what they think is the right answer.
So one of the factors we know that happens when we do online and phone comparisons is people are a little more negative online. They're more likely to say they're dissatisfied with things or disapprove of them, or they're more likely to say they're unhappy with something. It's not a huge factor, but it's notable, right? Online you get a little more negativity, right? So that may be a factor related to interviewing. But there is a broader and more fundamental effect that we call mode effects.
People simply answer questions differently. Again, this goes back to the psych principles, to some of that sort of fundamental psychology. You read something on a paper, and you answer it one way. And you talk to someone on the phone, and you're maybe answering a little bit differently for a huge host of reasons. A good example is the "don't know" response. Most survey questions have an option to say "I don't know." Some people choose it because they don't want to give an answer. Some just genuinely don't know.

Great company to work for.
Very friendly staff and a good work-life balance. Easy to book and change shifts, and also very flexible if you need a different start or finish time.
Cons: No set minimum hours a week, so you never know if you will have enough hours of work every week.
Ipsos MORI is a good place to work between jobs just to earn a bit of extra money, and it is incredibly flexible with hours. However, the work itself is pretty soul-destroying, plus the atmosphere is pretty competitive, as every person is trying to get the most interviews they can in one shift.
Most of the time getting surveys is sheer luck, but management and those around you make it sound like it's all in the introduction. Pros: Decent pay, flexible hours. Cons: Stressful, competitive, management are bullies.
Telephone interviewing does not really suit me, as I am not particularly naturally sociable. Although I have good communication skills and speak several languages, I am not a natural conversationalist, and using a telephone is an unnatural task for me.
Bad Management, Toxic Micromanagement Culture. Operations in particular. A lot of blame games and strange employee layoffs with no explanation, which makes job security very unsettling. Progression is also pretty non-existent.
Never even got the chance to work. After going through assessment days, the recruitment process, filling in sheets and purchasing the equipment I needed (headset and phone), on the day of my first shift not only did I not get sent my log-in details, but I also got a phone call and was told that since I am at school I cannot work. Stress over nothing; the communication is awful, they don't send anything on time, and I spent so long filling in forms etc. Complete waste of time, do not recommend at all. They don't pay well either.
Easy job - so boring. Preys on the vulnerable to get them in untested, with a useless training programme - the trainer I had on my "assessment day" was solely there to massage his own ego.
There is no assessment or interview whatsoever. Lectured like children through a fake telephone script before being told we all had a job anyway. Most supervisors are untrained in basic people skills. Ego trips seem to be endemic from top to bottom - many supervisors also clearly think they're better than everyone else. The office manager will cut shifts and stop offering work if any concerns are brought to her, and protects her favourites. The culmination of this came at the one and only social event I went to.
One member of staff, who was clearly friends with the office manager, punched another in the face and was allowed to work two days later as if nothing had happened. I left just before the coronavirus lockdown - staff members had stolen the hand sanitizer, which was not replaced. The measures taken to make the workplace Covid-secure were to leave one space between staff members - this still was nowhere near a 2-metre gap and I didn't feel safe.
Some interviewers are nice - other than that I am struggling to find any positives.
Pros: Flexible, good for extra money as long as shifts are available. Some supervisors were lovely and were the only thing giving me motivation to stay positive. Easy work if you can put up with members of the public treating you like dirt.
Cons: You are expendable. Minimum wage. No extras or perks, which is unusual for a call centre, except on one job you get money for every complete survey. Management quite rude and don't care about staff. The call centre is dirty, with seats and microphones falling apart; I had constant colds working here. I saw one supervisor belittle an older woman for talking when he was about to go over what a new project was about - he hadn't started yet and she wasn't the only one talking, and I saw him call her out numerous times for nothing. The same supervisor stood over me behind my back while I wrote some notes after a call and loudly gave me a row for 'drawing pictures'. I think he was one of those people that power-trip over being a supervisor and liked to humiliate people. The same supervisor also bragged about having a something-year-old girlfriend while he was well into his 50s or approaching retirement - he was a complete sexist pig and talked about other women he had been with in graphic detail.
Worked there for a year and a half; initially loved the flexibility, and when there were plenty of shifts available I got full-time hours. For a few months it was really quiet, so I barely got any work - maybe 4-hour shifts a month.
You are supposed to book 3 shifts per week; one week only one shift was available to me. I got an email from the administrator.
Very uncomfortable place, rife with bullying. There are too many leaders and not enough followers. Basically it's like being at high school and everyone hates you for being you.
You can schedule various shifts to fit your lifestyle. People there are friendly and helpful. The only negative is that because of the zero-hour contract format, during slower business periods you may not be able to get as many shifts as you want.
A good vibrant place to work.