Long survey distribution strategy
Denmark intends to make the survey available to all students enrolled at a Danish university who have done an exchange in another country.
November 2016: the Ministry will distribute the survey to the relevant students through the universities
December 2016/January 2017: Reminder to the same students
24 February 2017: Now distributed to all Erasmus+ coordinators at the Danish universities, who should be the right ones to address. I have asked them to put it on their webpages and, best of all, to send it directly to students who have been abroad for a shorter or longer period since September 2015.
Finland aims to distribute the survey to all students who have done an exchange in another country, from all Finnish HEIs (both universities and universities of applied sciences), between 2013 and 2016.
The survey will be distributed using the following channels:
The stakeholder organisations will be asked to distribute the survey link by email (CIMO), post it on their websites (HEIs: student portal) or use social media (ESN: Facebook and Twitter).
15 April: CIMO distributes the survey by email using its own mailing lists (cimppa and cimeoni)
15 April: ESN Finland will post the link to the survey on Facebook and Twitter.
CSC will send a reminder to the networks of international affairs (Pinnet and Aivoriihi) about the ongoing survey after the summer holidays in autumn 2016.
In addition, the link to the survey should be posted on the Emrex website.
18 April: CIMO and ESN Finland received the letter for students and promised to distribute the email, including the link to the survey.
Finland has received a large number of responses from different countries.
Italy will distribute the survey to all students who have done an exchange outside Italy in the last couple of years.
Norway: all HEIs, current mobile students.
The students will be reached through the following channels:
We expect the best (largest) response through this channel, as SIU coordinates the distribution of several other questionnaires.
The students will be contacted right after the start of the semester: in August or September 2016 and in February 2017. Students are used to a lot of communication from their HEIs at the beginning of each semester and will be most motivated to answer the long survey then, as opposed to right before the exams.
Sweden intends to make the survey available to all students who have done an exchange in another country, from all Swedish HEIs (both universities and university colleges). The main target group will probably be students who studied abroad between 2013 and 2016.
May: survey distributed to the Swedish HEIs included in the reference group. They use their own mailing lists to reach students; HEIs are recommended to post the link on their websites etc.
27 May: meeting, with a reminder to the same reference group.
20 September: the survey was promoted at the NUAK meeting in Stockholm (administrative personnel) and the link was distributed to all Swedish HEIs.
23 September 2016: the survey was promoted at the yearly Swedish Erasmus meeting (administrative personnel), with all HEIs attending.
The procedure can/should be repeated during 2017, then including only students who did their studies abroad in 2016.
On emrex.eu: whenever one of the partners starts sending invitations to students, the information about the survey should be moved to the news section on emrex.eu.
The survey will have to be resent in spring 2017.
Institutions involved in survey dissemination
| Institution | Country | Date when the dissemination began | How is it done (link/e-mails/...)? |
|---|---|---|---|
| CIMO | Finland | 18.4.2016 | CIMO asked to send long survey requests to its contacts in HEIs, by email |
| ESN | Finland | 18.4.2016 | ESN Finland will publish the link/letter on Facebook and Twitter |
The research has been divided into two parts. Each has its own sample, methodology and research questions.
- Short survey built-in EMREX:
- questions asked just after the import of the academic record
- evaluation of the user experience
- Long (but not too long) survey independent from EMREX
- for EMREX users and non-users
- questions concerning: the process of academic record recognition, recognition rates, knowledge of EMREX, and opinions on EMREX.
Long survey - recognition process evaluation
Objectives and research questions
The main goal of the long survey is to gather students’ opinions on the recognition process. The survey covers:
- Rates of recognised and unrecognised academic achievements (registry-based analysis will show only recognised achievements)
- Issues with the recognition of academic achievements
- Opinion on the process
- Administrative burden as a barrier preventing mobility (compared to other barriers)
- Awareness and usage of EMREX
- Opinion on EMREX (incl. reasons for not using it)
The survey is hosted on a server of the University of Warsaw. It can be accessed via this link: https://ankieter.mimuw.edu.pl/en/surveys/81/.
Partners are responsible for promoting the survey in their countries. The survey is open, i.e. anyone with the link can take part in it, which makes it very easy to disseminate. The link should be posted on any site that exchange students may visit: HEIs’ websites, IROs’ websites, students’ organisations’ websites, Facebook pages, etc. The survey will run until November 2017, so it is not a one-time event and information about it should not be posted only in the news sections of websites. Additionally, HEIs could be asked to email their exchange students during the periods when the highest number of students get their achievements recognised.
Long survey mid-report (summarising 2016 results): Long survey summary Feb 2017
Short survey - user experience evaluation
- Ease of use
- Clarity of instructions
- Speed of EMREX
- Technical problems
- Look and feel of EMREX
- Every user of EMREX
- After the import of records a student will be asked to fill in the questionnaire
Questionnaires and research techniques
questionnaire: Short survey - questionnaire and variable list
Final version is available on the server of Department of Mathematics, Computer Science and Mechanics, University of Warsaw:
The following has been agreed on:
- The student should get a message upfront about no personal data being sent to the questionnaire. The parameters sent will remain visible (i.e., non-encrypted) so that the student can verify that this is indeed the case.
- "host" is the country/institution hosting and delivering the results, "home" is the receiving side
- env: "dev" (development), "test" (pre-production), or empty meaning production
- home_institution and host_institution: schacHomeOrganization code (example: tut.fi)
- home_country and host_country: two letter code of the respective country
- date_of_import: ISO-format (YYYY-MM-DD)
- time_spent: in seconds, from the moment the client page was displayed to the student to the moment the questionnaire link was shown
The parameters host_institution, grades_imported and ects_imported can be comma-separated tuples, when the NCP has delivered results from more than one institution.
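As a rough sketch of how these agreed parameters could be assembled into the questionnaire link (the exact mechanism on the EMREX client side is not specified here; parameter names follow the agreed list above, and the base URL and survey ID 79 are taken from the examples elsewhere in this page):

```python
from urllib.parse import urlencode

def build_short_survey_link(params, survey_id=79,
                            base="https://ankieter.mimuw.edu.pl"):
    """Build the questionnaire link with the agreed, non-encrypted parameters
    appended as a plain query string, so the student can inspect them."""
    # Multi-valued fields (results delivered by more than one institution)
    # become comma-separated tuples, as agreed above.
    flat = {k: ",".join(v) if isinstance(v, (list, tuple)) else str(v)
            for k, v in params.items()}
    return f"{base}/surveys/{survey_id}/?{urlencode(flat)}"

# Hypothetical values, for illustration only:
link = build_short_survey_link({
    "env": "test",                                   # empty for production
    "home_institution": "uw.edu.pl",                 # schacHomeOrganization code
    "host_institution": ["tut.fi", "helsinki.fi"],   # comma-separated tuple
    "home_country": "pl",
    "host_country": "fi",
    "date_of_import": "2016-11-30",                  # ISO format
    "time_spent": 127,                               # seconds
})
```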
Language versions of countries participating in the field trial will be available if partners deliver translations.
Summary - January 2017: Short survey summary Jan 2017
Language versions in surveys (by Łukasz Karniewski)
Here is how it's supposed to work. I'll skip the "?env=test&session_id=........." part in the examples.
The link to the survey can contain 2 optional language parameters.
The first one is the language of the application (buttons, built-in texts, etc.) - it can be either Polish or English.
The second one is the language of the survey (questions and answers) - in this particular case it can be one of: English, Norwegian, Swedish, Finnish, Danish or Polish.
The first parameter comes at the beginning of the path, right after the host name. The second comes at the end, after the survey ID. So, for example, for the buttons to be in English and the survey to be in Norwegian, the link should be: https://ankieter.mimuw.edu.pl/en/surveys/79/no/
If the first parameter is missing, the application language will be picked based on request (browser settings). If neither of the two supported languages is present in the request, the default (Polish) will be used.
If the second parameter is missing, the survey language will be set to the first one from the list (in this case: English).
When a link with the first parameter present is used (clicked or entered in the address bar), the application will extract the language info from the link, switch to that language (the language info is saved in a cookie), and then remove it from the link. So, if you click this link: https://ankieter.mimuw.edu.pl/en/surveys/79/
the language will be switched to English, but the link in the address bar will be: https://ankieter.mimuw.edu.pl/surveys/79/
https://ankieter.mimuw.edu.pl/en/surveys/79/pl/ - buttons in English, survey in Polish
https://ankieter.mimuw.edu.pl/en/surveys/79/no/ - buttons in English, survey in Norwegian
https://ankieter.mimuw.edu.pl/en/surveys/79/ - buttons in English, survey in English
https://ankieter.mimuw.edu.pl/pl/surveys/79/ - buttons in Polish, survey in English
https://ankieter.mimuw.edu.pl/surveys/79/ - buttons in [depends on browser settings], survey in English
Please check your browser’s content language settings and try these links. The results should be as described.
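The rules above can be sketched as a small helper that composes a link from the two optional parameters. Note that only "en", "pl" and "no" appear in the examples above; the codes for Swedish, Finnish and Danish ("sv", "fi", "da") are assumed ISO codes here.

```python
from typing import Optional

APP_LANGS = ("pl", "en")                              # "pl" is the default
SURVEY_LANGS = ("en", "no", "sv", "fi", "da", "pl")   # first entry is the default

def survey_url(survey_id: int,
               app_lang: Optional[str] = None,
               survey_lang: Optional[str] = None,
               base: str = "https://ankieter.mimuw.edu.pl") -> str:
    """Compose a link: app language right after the host, survey language
    after the survey ID."""
    parts = [base]
    if app_lang is not None:   # omitted -> language picked from browser settings
        parts.append(app_lang)
    parts += ["surveys", str(survey_id)]
    # The default survey language (first in the list) needs no path segment.
    if survey_lang is not None and survey_lang != SURVEY_LANGS[0]:
        parts.append(survey_lang)
    return "/".join(parts) + "/"
```

For example, `survey_url(79, "en", "no")` reproduces the buttons-in-English, survey-in-Norwegian link listed above.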
Counting reliable statistics about the number of EMREX users
At the end of the project we should be able to show how well we did during the field trial, which means we should deliver and compare two numbers (for each pair of countries):
- the number of students who used EMREX,
- the number of mobile students between partner countries.
The number of mobile students between partner countries should be counted on the basis of the Mobility Tool Report, which contains the list of mobilities, dates, and home and host institutions. We should take into account only mobilities from the period of the field trial (in particular, the exact dates when EMREX was deployed) and only those institutions which participated, e.g. only two universities from Italy.
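A minimal sketch of this filtering step, assuming the Mobility Tool Report has been exported into a list of records with institution codes and start/end dates (the field names and values here are hypothetical):

```python
from datetime import date

def count_mobilities(report, deployed_from, deployed_to, participating):
    """Count mobilities that fall inside the field-trial window and involve
    only participating institutions (both home and host)."""
    return sum(
        1 for m in report
        if m["home_institution"] in participating
        and m["host_institution"] in participating
        and m["start"] >= deployed_from
        and m["end"] <= deployed_to
    )

# Hypothetical report rows; the real Mobility Tool Report would be exported
# and parsed into this shape first.
report = [
    {"home_institution": "uw.edu.pl", "host_institution": "tut.fi",
     "start": date(2016, 1, 10), "end": date(2016, 6, 1)},
    {"home_institution": "uw.edu.pl", "host_institution": "other.it",
     "start": date(2016, 1, 10), "end": date(2016, 6, 1)},
]
n = count_mobilities(report, date(2015, 9, 1), date(2017, 6, 30),
                     {"uw.edu.pl", "tut.fi"})  # -> 1 (second row's host did not participate)
```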
The number of students who used EMREX is trickier to estimate. We have four sources of data:
- records from the short survey - these are very good statistics, but not all students fill in the questionnaire,
- NCP logs - so far delivered only by Sweden and Norway. These logs contain not only students using EMREX to transfer results but also tests by developers, students revisiting EMREX, etc. An account is needed to log in to the NCP, so the number of fake records shouldn't be large.
- SMP logs - so far delivered only by Sweden and Norway. As in the case of the NCP logs, they might contain extra records from developer testing. The number of stored XML/PDF files also counts as part of the SMP logs.
- Pamela, or generally WP4 members who know students by name and invite them to use EMREX. Sometimes students use EMREX in the presence of project members. A list of such students would be helpful; it need only contain a date, and the home and host country/institution.
The best strategy is to gather data from all those sources and compare them. During the group meeting in Stockholm on December 2nd, 2016, it was decided that we will proceed in two steps:
- In the first week of January 2017, partners will gather data from sources 2 and 3 and clean it by deleting all records which were obtained from testing, or any other records which should not be taken into account. They will send these data to Janina together with a short description of how the logs are created.
- The same procedure will be repeated at the end of the field trial, so we will keep the cleaned logs from 2015 and 2016 to avoid doing the same work twice.
WP4 members should also send the list mentioned as the fourth source of data.
Janina will gather the data and do cross checking.
The final report from WP5 will be the source of the number of EMREX users officially delivered in the final report and in our publications.
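The cross-checking step could look roughly like this: count uses per (home, host) pair in each cleaned source and tabulate them side by side, so discrepancies between sources are visible per pair. The record format below is hypothetical:

```python
from collections import Counter

def count_pairs(records):
    """Count EMREX uses per (home country, host country) pair."""
    return Counter((r["home_country"], r["host_country"]) for r in records)

def compare_sources(**sources):
    """Tabulate counts from each cleaned data source side by side.

    Each source is assumed to be a list of dicts with home_country and
    host_country keys, already cleaned of test records.
    """
    counts = {name: count_pairs(recs) for name, recs in sources.items()}
    pairs = sorted(set().union(*counts.values()))
    return [(home, host, {name: c[(home, host)] for name, c in counts.items()})
            for home, host in pairs]

# Illustrative data only:
rows = compare_sources(
    short_survey=[{"home_country": "fi", "host_country": "no"}],
    ncp_logs=[{"home_country": "fi", "host_country": "no"},
              {"home_country": "fi", "host_country": "no"}],
)
# -> [("fi", "no", {"short_survey": 1, "ncp_logs": 2})]
```

A gap between the per-pair counts (as in this toy example) would indicate either survey non-response or uncleaned test records.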
Below is a preliminary analysis based on data gathered up to November 30th, 2016. Only the short survey and NCP logs are compared. The data was cleaned by Janina only and most probably still contains test records.
| Home institution country | Host institution country | Count | Home institution country | Host institution country | Count |
|---|---|---|---|---|---|
Another set of data, gathered up to January 18th, 2017. Only the short survey and NCP logs are compared. The data was cleaned by Janina only and most probably still contains test records.
| Home institution country | Host institution country | Count | Home institution country | Host institution country | Count |
|---|---|---|---|---|---|