Business & Technology
Tes appoints Ali Nazarboland as Engineering Vice President
SOFIAH NICHOLE SALIVIO
News Editor
Tes has appointed Ali Nazarboland as Vice President of Engineering as it expands its Tes360 platform for schools and trusts.
Nazarboland joins the education technology group with more than 20 years of engineering leadership experience across financial technology, payments, insurance, the public sector and other regulated industries. He will lead Tes’s engineering function as the company develops its wider technology platform.
His appointment strengthens the senior leadership team as Tes places greater emphasis on Tes360, a connected platform designed to bring together information used by schools and multi-academy trusts. The product is intended to address problems caused by disconnected systems and help staff turn data into action.
Before joining Tes, Nazarboland oversaw global engineering teams of more than 350 engineers across Europe, the Middle East and Africa, the US and Asia-Pacific. His background includes scaling engineering organisations and modernising legacy technology systems.
Tes360 focus
Tes has been broadening its technology offering across the education sector, with software and services covering timetabling, special educational needs and disabilities provision, behaviour management, staff wellbeing, parents’ evenings, recruitment and professional development. Tes360 sits at the centre of that strategy, linking information across those functions.
The company, which has operated in education for more than a century, has sought to combine software products with editorial and sector insight through Tes Magazine. The latest leadership appointment suggests engineering remains central to that plan as schools and trusts seek clearer oversight across multiple systems.
Rod Williams, Chief Executive Officer at Tes, said: “Ali brings a combination of technical expertise and leadership experience to Tes. As we continue to scale Tes360, it’s vital that we have strong engineering leadership that can combine strategic thinking with deep technical understanding. What stood out about Ali is his passion for education and the opportunity to contribute to work that has a genuine societal impact.”
Sector background
Nazarboland’s career has spanned sectors where large engineering estates and regulation often shape product and infrastructure decisions. That experience is relevant to education technology, where suppliers are under pressure to integrate systems more effectively while giving school leaders access to information spread across administrative, pastoral and teaching functions.
For Tes, the appointment also reflects the operational demands of expanding a platform used by institutions managing large volumes of pupil, staff and school performance data. In that context, engineering leadership can influence how quickly products are updated, how legacy systems are connected and how consistently services run across different markets.
Williams said the appointment formed part of a broader technology investment, with Tes continuing to expand Tes360 and its wider education ecosystem as it seeks to deepen its software relationships with schools and trusts.
Nazarboland said the company’s stage of development and its education focus were key factors in his decision to join. “Tes is entering an important phase in its journey, and the opportunity to be part of that is a major draw for me. Throughout my career I’ve worked across a variety of sectors, but being able to apply technology in a way that has a meaningful impact on education and young people is particularly rewarding. I’m looking forward to working with the team to continue building scalable, high-performing engineering capabilities that support Tes’ ambitions.”
Business & Technology
UK firms race ahead on AI, but controls lag behind
Writer, the enterprise AI company, has published research showing that many large organisations are adopting AI faster than they are putting controls in place. The survey points to governance gaps, data security concerns and weak oversight of autonomous AI systems.
The findings are based on a survey of 2,400 executives and employees in large enterprises, and suggest many of the UK’s biggest businesses could be exposed to risk as staff turn to unapproved AI tools and companies struggle to supervise newer autonomous systems.
Among the most striking findings, two-thirds of C-suite leaders said they believed their organisation had already suffered a data leak or security breach caused by an employee using an unapproved AI tool. Only a third of executives were certain no such breach had taken place.
Employee responses pointed to widespread use of public or prohibited tools. One in three said they had entered proprietary, confidential or sensitive company information into a public AI tool, while 16% had used AI products explicitly banned by their employer.
Those behaviours appear to be linked to frustration with approved systems and pressure to deliver work quickly. Employees who used banned tools most often said they would use whatever was needed to get their work done. Others said approved tools were too poor to use or that enforcement was weak.
Executives also acknowledged limited oversight. More than a third said they did not have full visibility or control over which AI tools employees were actually using inside their organisations.
Reporting fears
The research also highlighted tension between staff and management over harmful AI outputs. More than a quarter of employees said they had seen an AI tool at work produce a result that was dangerously wrong, unethical or biased.
Yet three in ten said they did not feel safe reporting dangerous or unethical AI behaviour to their employer because they feared retaliation. That contrasts sharply with senior leaders’ views: 90% of executives believed employees were safe to speak up.
The gap suggests companies face not only technical and compliance risks, but also cultural problems as AI tools become more embedded in day-to-day operations. Staff may be reluctant to challenge systems or report problems if they think doing so could be seen as resistance to adoption.
Agent oversight
Writer’s survey placed particular emphasis on autonomous AI agents, which are starting to move from pilot projects into regular business use. Here too, the results showed a lack of confidence in internal controls.
More than a third of executives said they were not confident they could shut down an autonomous AI agent if it began causing financial or reputational harm. A similar share said their organisation still lacked a formal, documented plan for supervising AI agents.
Leaders identified the main governance concerns around these systems as security and data protection, employee training, transparency over how agents operate and explainability. Only a quarter ranked ethical alignment among their top concerns, despite wider concern in the survey about problematic AI outputs.
The report also suggested some AI strategies are being shaped as much by image as by internal readiness. Three-quarters of executives said their company’s AI strategy was driven more by public signalling than by practical internal direction.
That points to pressure on senior management to show progress on AI even when policies, oversight structures and employee safeguards remain incomplete. It also helps explain why some organisations may be seeing a rise in so-called shadow AI, where staff use tools outside approved channels to meet performance demands.
The commercial and personal stakes appear high. Six in ten leaders said an agent-driven error causing serious damage would cost a senior executive their job. Respondents most commonly identified the chief executive officer, chief information officer or chief technology officer as most likely to be affected.
The survey comes as companies face growing scrutiny over how they deploy generative AI and autonomous systems in customer service, internal operations and knowledge work. While many employers are pressing ahead with rollout, the data suggests governance has not kept pace with the technology’s spread across large organisations.
The findings indicate that oversight of AI use now extends beyond procurement and policy into day-to-day workforce behaviour, internal reporting culture and companies’ ability to intervene when automated systems go wrong.
Two numbers capture the tension most clearly: 67% of C-suite leaders believe their organisation has already suffered an AI-related data leak or breach through unapproved tools, and 35% of executives said they were not confident they could pull the plug on an autonomous agent causing financial or reputational damage.
Business & Technology
LUC launches ENGAGE3D for infrastructure consultations
LUC has launched ENGAGE3D, an immersive visualisation tool for community consultation on infrastructure projects, designed to help local people understand how proposed developments could appear in their area.
The system uses game-engine technology to create interactive 3D models of proposed schemes within real-world landscapes, displayed on a touchscreen television at consultation events. Users can move through a site at eye level, switch to a virtual drone view, and compare different layouts and scenarios.
ENGAGE3D can also be tailored for individual projects. Users can explore landmarks and selected viewpoints while switching between seasons, weather conditions, visibility settings and turbine speeds, alongside supporting media and annotations.
Each model draws on several datasets, including LiDAR terrain models, aerial imagery, the National Tree Map and photography, to reflect conditions on the ground. The approach is intended to give communities a clearer view of how planned infrastructure could alter local landscapes.
The launch comes as infrastructure developers face growing pressure to show residents what projects will look like before planning decisions are made. Visual impact is often a central issue in consultations on wind farms and other energy developments, particularly in rural areas.
One of the first organisations to adopt the system is Trydan Gwyrdd Cymru, the publicly owned renewable energy developer in Wales, which is using the technology in consultations on a series of new wind farm proposals across the country.
Residents can explore landscapes within an average 10-kilometre radius of each site through the model prepared for the developer. Trydan Gwyrdd Cymru commissioned LUC to apply the system so communities could better understand the visual change that might result if projects proceed to construction and operation.
“At LUC, we believe that the best projects start with listening. Effective consultation builds understanding, strengthens trust, and helps communities feel part of shaping their future,” said Rob Booth, chief executive of LUC, outlining the thinking behind the launch.
He added: “This is why we developed ENGAGE3D – an integrated service backed by 60 years of environmental consultancy expertise and robust GIS data. It is a tool that will facilitate meaningful conversations about development proposals and place communities at the heart of decision-making.”
Early use
Trydan Gwyrdd Cymru said the model has already been used at early-stage project introduction events, helping people examine how proposed turbines would appear from both nearby locations and several kilometres away.
“The 3D digital model is an excellent tool for visualising what a project can look like in the local landscape from close up, or from kilometres away. It helps provide context and illustrates how features such as trees and buildings, or topographic effects, can make turbines less apparent from some locations and more obvious from others,” said Dr Catrin Ellis-Jones, head of public involvement at Trydan Gwyrdd Cymru, describing how the model is being used in those consultations.
Ellis-Jones said the system also allows residents to compare new proposals with existing turbines where relevant and broadens access to technical planning material.
“It allows direct comparison with existing turbines where they exist, which people are often keen to see. It makes the data and designs we draw up easily accessible to a wide range of people, young and old, and in turn helps us gather informed and specific feedback on our proposals.
“It was appreciated by local people and stakeholders who participated in our early-stage project introduction events, and the 3D model will be updated through the iterative and consultative planning process, so people can also see our designs evolve,” she said.
LUC is an environmental consultancy offering planning, impact assessment, landscape design, ecology and geospatial services to public and private sector clients. The employee-owned firm has more than 300 staff across offices in London, Bristol, Edinburgh, Glasgow, Sheffield, Cardiff and Manchester.
Business & Technology
Banbury nursing home found in ‘breach of legal regulations’
Banbury Heights Nursing Home in Old Parr Road in Banbury was rated ‘requires improvement’ by the Care Quality Commission (CQC).
The home has received this rating for the fourth time in a row, following inspections in 2022, 2023 and 2024.
The service was found to be in breach of legal regulations relating to people’s safe care and treatment.
The same breach was identified at its 2022 and 2023 inspections.
Inspectors found the provider did not make sure that medicines and treatments were safe and met people’s needs.
The inspection also revealed the service did not have equipment that supported safe care.
Patients’ records did not always reference people’s medical needs or document important information, such as whether they were on a palliative care pathway, the inspection report added.
Inspectors also found that some staff did not always embody the culture and values of the organisation.
Banbury Heights Nursing Home is a residential care home providing personal and nursing care to up to 58 people.
The service provides support to older and younger people with dementia, physical disabilities, sensory impairments, mental health needs, learning disabilities or autism in one adapted building.
At the time of the inspection there were 41 people using the service.
People living at Banbury Heights provided positive feedback about the service.
Comments included “I am well looked after here”, “staff are lovely” and “they look after me well here”.
People’s relatives provided mixed feedback about staffing levels and the condition of the building.
Among the comments were “oh yes, I feel there are [enough staff]”, and, “there are definitely less staff at night and it may affect his care”.