Tuesday, November 26th

    The Impact of Human Native AI on the Marketplace for AI Training

    Human Native AI, a London-based startup, facilitates licensing arrangements for AI-training content, allowing AI companies to find and compensate rights holders while taking a cut of each deal.

    For AI systems and large language models to be accurate, they must be trained on vast volumes of data, but they shouldn't be trained on data they aren't authorized to use. OpenAI's agreements last week with The Atlantic and Vox show that there is appetite on both sides of the table for these AI-training content licensing deals.

    London-based startup Human Native AI is building a marketplace to facilitate such transactions between the many companies building LLM projects and those willing to license material to them.

    The goal is to help AI companies find material to train their models on while allowing rights holders to opt in and be compensated. Rights holders upload their content at no charge and strike revenue-share or subscription deals with AI companies. Human Native AI also helps rights holders prepare and price their content and monitors for potential copyright infringements. Human Native AI takes a cut of each deal and charges AI companies a fee for its transaction and monitoring services.

    James Smith, CEO and co-founder, told TechCrunch that the idea for Human Native AI grew out of his past experience working on Google's DeepMind project, which also ran into the problem of not having enough good data to properly train its systems. He then watched other AI companies run into the same issue.

    “It feels like we are in the Napster era of generative AI,” Smith said. “Can we get to a better era? Can we make it easier to acquire content? Can we give creators some level of control and compensation? I kept thinking, why is there not a marketplace?” He pitched the idea to his friend Jack Galilee, an engineer at GRAIL, during a walk in the park with their respective kids, as Smith had done with many other potential startup ideas. But unlike past times, Galilee said they should go for it.

    The company launched in April and is currently in beta. Smith said demand from both sides has been very encouraging, and the company has already signed a number of partnerships that will be announced in the near future.

    This week, Human Native AI announced a £2.8 million seed round led by LocalGlobe and Mercuria. Smith said the company plans to use the funds to build out its team. "I was the CEO of a two-month-old company, and I got to meet with the CEO of a 160-year-old publishing house," Smith said. "That shows you the demand on the publishing side. Equally, every conversation with a large AI company goes exactly the same way."

    Although it is still very early, Human Native AI appears to be building infrastructure that the growing AI industry has been missing. The large AI players need vast amounts of data to train on, and the marketplace gives rights holders an easier way to work with them.

    Another interesting aspect is the future potential of the data Human Native AI collects. Smith said the company will eventually be able to give rights holders more clarity on how to price their content, based on the historical deal data accumulated on the platform.

    It is also a smart time to launch Human Native AI. Smith said that with the EU AI Act evolving and potential AI regulation in the US on the road ahead, the need for AI companies to source their data ethically, and to have the receipts to prove it, will only become more pressing. "We're optimistic about the future of AI and what it's going to do, but we have to make sure that as an industry we're being responsible and not destroying the industries that got us to this point," Smith said. "That would not be good for human society. We have to make sure we find the right ways to engage people. We are AI optimists on the side of humanity."

