The challenges facing primary education are significant: a growing teacher shortage, a heavy administrative burden that contributes to work-related stress, and an increasingly diverse classroom population. A promising new technology that can help teachers and children meet these challenges is the social robot. These physical robots often use artificial intelligence and can communicate with children by taking on social roles, such as that of a fellow classmate or teaching assistant. Previous research shows that social robots can, in several respects, lead to better results than traditional educational technologies. However, social robots not only bring opportunities but also raise new ethical questions. In my PhD research, I investigated the moral considerations of different stakeholders, such as parents and teachers, in order to create the first guideline for the responsible design and use of social robots in primary education.

This study combined several research methods. First, a large international literature review was carried out on the advantages and disadvantages of social robots, in which 256 studies were ultimately analysed. Focus group sessions were then held with stakeholders: a total of 118 parents of primary school children, representatives of the robotics industry, educational policymakers, government education advisors, teachers and primary school children contributed. Based on the insights from the literature review and the focus group sessions, a questionnaire was drawn up and distributed to all stakeholder groups; drawing on the 515 responses, we then classified the stakeholders' moral considerations. In the final study, we examined the influence of social robots on children's social-emotional development through in-depth interviews with teachers who used robots in their daily teaching and who had supervised the child-robot interactions of more than 2,500 unique children.
Our research shows that social robots can have both advantages and disadvantages for primary education. The diversity of the disadvantages makes responsible implementation complex. Nevertheless, despite their concerns, all stakeholder groups viewed social robots as a potentially valuable tool. Many stakeholders worry about a possible negative effect of robots on children's social-emotional development. Our findings indicate that, when used responsibly, social robots currently do not appear to harm this development, although some children seem more prone to excessive attachment to robots. How people think about robots is also shaped by several factors: low-income stakeholders, for example, take a more sceptical view of social robots in education, and age and level of education were likewise strong predictors of stakeholders' moral considerations.

This research has resulted in a guideline for the responsible use of social robots as teaching assistants, intended for primary schools and robot builders. The guideline offers schools practical tools, such as involving parents in advance and using robots to encourage human contact, and gives school administrators insight into the reactions they can expect from parents and other parties involved. It also provides recommendations for safeguarding privacy, such as data minimization and improving the technical infrastructure of schools and robots, which often still leaves much to be desired. In short, the findings from this thesis provide a solid stepping stone for schools, robot designers, programmers and engineers to develop and use social robots in education in a morally responsible manner, paving the way for further research into robots as assistive technology in primary education.