The Blockchain Art Exchange, based in London, is seeking to establish a “more democratic art market.”
The idea is that when a digital artist sells work on the Blockchain Art Exchange they will receive Blockchain Art Exchange (BAE) tokens.
Holders of BAE will receive royalty payments from every artwork sale that happens on the BAE exchange. Hence, the more sales an individual accrues, the larger the share of royalties received. In theory, emerging artists will be supported by royalties from the sales of more established artists.
The sequence is as follows:
1) Submit digital artwork for analysis.
2) The digital work is graded and given an “objective price.”
3) The digital work is uniquely identified with a blockchain certificate to prove authenticity.
4) The digital work can be exchanged instantly on the BAE platform.
Can such an idea be executed for traditional (non-digital) art forms?
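The pro-rata royalty mechanism described above can be sketched in a few lines. This is an illustrative assumption about how BAE's royalty pool might be split (the source does not specify the formula), not the exchange's actual implementation:

```python
# Hypothetical sketch: distribute a sale's royalty pool across BAE token
# holders in proportion to the tokens they hold. Names and the royalty
# rate are illustrative, not BAE's real parameters.
def distribute_royalties(sale_price, royalty_rate, holdings):
    """holdings: dict mapping artist -> BAE tokens held.
    Returns a dict mapping artist -> royalty payout for this sale."""
    pool = sale_price * royalty_rate
    total = sum(holdings.values())
    if total == 0:
        return {artist: 0.0 for artist in holdings}
    return {artist: pool * tokens / total for artist, tokens in holdings.items()}

holders = {"established_artist": 900, "emerging_artist": 100}
payouts = distribute_royalties(sale_price=1000.0, royalty_rate=0.05, holdings=holders)
# The emerging artist earns a share of a sale made by someone else,
# which is the "supported by established artists" effect the model targets.
```

Under this assumed scheme, a 5% royalty on a 1,000-unit sale yields a 50-unit pool, of which the emerging artist (holding 10% of tokens) receives 5.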
This video presents the top five challenges common to many in the machine learning and data science community.
Scarcity of Data
The more data that can be used for modeling and predictions, the better.
This isn’t a problem for big companies such as Facebook and Google. For many others, however, a lack of sufficient data can limit results, rendering machine learning less productive.
Unclear Question
Vague questions will not result in substantive results. Data science is about recognizing patterns. So, clear questions are fundamental to defining what types of patterns to analyze.
Unclear Representation of Data
For a data scientist, the resulting work needs to be presented to the end user in meaningful ways. There is room for better libraries that make representing data easier.
Expensive Resources
Running millions of lines of code over and over can be expensive, though it should become less so in the future.
Machine Learning Algorithm Selection
Currently, the biggest challenge in data science is the selection of the right algorithms. It’s important to understand the algorithms as well as possible to determine which ones will most benefit your project.
Artificial General Intelligence (AGI) is in the process of rewriting what it means to live on planet Earth.
AGI refers to the intelligence of a machine that could successfully perform any intellectual task that a human being can.
AGI is a goal of some artificial intelligence research, as well as a science fiction topic.
Kimera Systems, Inc. offers artificial intelligence technology that observes user behavior and context and derives a common-sense set of actions to apply under specific circumstances.
Mounir Shita, Co-Founder and CEO of Kimera Systems, speaks about AGI and a vision of how to develop an economic model that benefits people rather than a handful of powerful organizations or governments.
In this brief video he discusses using token economics and blockchain to pay individuals via the devices they use, which hold data. He says it’s about financially rewarding people for living their lives.
Machine learning recognizes patterns in data and can help support human decision-making.
Some practical ways that media and entertainment companies can apply artificial intelligence and machine learning include contract management processing, budgeting, on-boarding production freelancers, and standardizing digital asset management.
In brief, machine learning can help make an institution’s accumulated knowledge accessible to stakeholders.
The future of artificial intelligence (AI) encompasses risks and challenges.
The public perception of AI includes a fear that artificial intelligence might take away jobs. And it’s true: AI is a tool people can use to run businesses more efficiently. Some new jobs will be created to support AI systems, but more jobs are anticipated to be lost than created.
Other risks for AI include data privacy concerns, government laws and regulations, and company stakeholders who don’t understand the technology.
AI is moving forward in spite of the challenges, but how smooth the transition will be for such an important shift in global technology and human labor is undetermined.
This NVIDIA video describes deep learning as the fastest-growing field in artificial intelligence, helping computers make sense of vast amounts of data in the form of images, sound, and text.
Using multiple levels of neural networks, computers now have the capacity to see, learn, and react to complex situations as well as or better than humans.
Specific real-world deep learning applications include analyzing in one month what used to take ten years; voice-to-text technology; robotics; autonomous cars; and beating a world champion at chess.
Every industry will be impacted by deep learning, and many businesses are already delivering new products and services based on this new way of thinking about data and technology.
This video outlines a parallel between token economic models and business models. The central idea is that the way business models are designed is to provide value for customers while extracting value for shareholders. In a token model, the design is intended to create value in the token itself and therefore reward those supporting the business, including customers and shareholders.
However, part of the problem of a single-token model is the inherent tension between those who want the token to rise in value, like an investment, and those who simply want to pay for services, as with a currency that maintains a stable value.
Hence, the design of this two-token model is intended to satisfy the needs of two groups: customers and business supporters. This model comprises three key principles:
Reward value creators
Align actor incentives
Support business sustainability and expansion
Often blockchains are thought of as a way to disintermediate middlemen. However, some middlemen add value. Hence, this system is designed to reward everyone who adds value to the business, whether creators or traditional middlemen.
This specific two-token system is designed for the music blockchain Emanate. The two tokens are designated MN8 and MNX.
MN8 is a governance token
MNX is a stable, internal cryptocurrency
Customers don’t care about MN8. They will swap money for MNX and use MNX to pay for services, just like traditional currency.
Different actors aligned with supporting the business will extract value from the system by receiving MNX as part of their provided services, according to pertinent smart contracts. They can then swap MNX for whatever currency they desire through an internal exchange.
However, they will need to stake some MN8 to participate in the system and earn MNX. Hence, they are buying into part ownership. Alternatively, they could be investing in the business itself.
All of the MN8 stakeholders can vote on the direction of the business. Those who own the most MN8 will have the most votes and the greatest vested interest in the success of the business and the MN8 token. As the business becomes more successful, the MN8 tokens would be anticipated to rise in value, in the same way the value of stock rises in a traditional, publicly traded corporation.
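The MN8/MNX mechanics described above can be sketched as a toy ledger. Everything here, including the minimum-stake rule and the class and method names, is an illustrative assumption rather than Emanate's actual smart-contract logic:

```python
# Hypothetical sketch of a two-token system: MN8 as a staked governance
# token, MNX as a stable internal currency for paying for services.
class TwoTokenSystem:
    MIN_STAKE = 10  # assumed minimum MN8 stake required to earn MNX

    def __init__(self):
        self.mn8 = {}  # actor -> staked MN8 (governance) balance
        self.mnx = {}  # actor -> MNX (stable currency) balance

    def stake_mn8(self, actor, amount):
        """Service providers buy in by staking MN8."""
        self.mn8[actor] = self.mn8.get(actor, 0) + amount

    def customer_pays(self, customer, provider, amount):
        """A customer (who has already swapped money for MNX) pays a
        provider in MNX; providers must hold a minimum MN8 stake."""
        if self.mn8.get(provider, 0) < self.MIN_STAKE:
            raise ValueError(f"{provider} has not staked enough MN8")
        self.mnx[provider] = self.mnx.get(provider, 0) + amount

    def voting_power(self, actor):
        """Governance weight is pro rata to staked MN8: more MN8, more votes."""
        total = sum(self.mn8.values())
        return self.mn8.get(actor, 0) / total if total else 0.0

ledger = TwoTokenSystem()
ledger.stake_mn8("artist", 40)
ledger.stake_mn8("curator", 10)
ledger.customer_pays("listener", "artist", 5)
```

Note how the tension described earlier is resolved: the customer only ever touches the stable MNX, while investment upside and voting rights live entirely in MN8.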
Security Token Academy host Amy Wan interviews Dr. Stephanie Hurder, partner and founding economist at Prysm Group, about token economics and tokenized assets.
Token economics, also known as crypto economics, is the study and design of economic systems based on blockchain technology.
Hurder expresses the importance of understanding the rights and governance that a token provides.
There are two primary types of tokens:
1) Security tokens
2) Utility tokens
The major difference is the intended use and functionality of the tokens. Security tokens are created as investments, and holders of a security token also receive an ownership stake in the company.
Utility tokens, on the other hand, are not intended to give their holders the ability to control how decisions are made in a company. They merely enable users to interact with a company’s services.
However, both security and utility tokens can increase in value, hence some people may find it difficult to differentiate them.
Hurder discusses the tension inherent in a utility token when people want it to rise in value, versus the stability required for its use in value exchange.
Token economics includes designing a token as a functioning part of an overall economic platform. In other words, ‘what’ do you tokenize so that everything works well together?
Every day, a large portion of the population is at the mercy of a rising technology, yet few actually understand what it is.
AI is designed so you don’t realize there’s a computer calling the shots. But that also makes understanding what AI is — and what it’s not — a little complicated.
In basic terms, AI is a broad area of computer science that makes machines seem like they have human intelligence.
It includes programming a computer to drive a car by obeying traffic signals.
The term “artificial intelligence” was first coined back in 1956 by Dartmouth professor John McCarthy. He called together a group of computer scientists and mathematicians to see if machines could learn like a young child does, using trial and error to develop formal reasoning. The project proposal said they would figure out how to make machines “use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves.”
But during the past few years, a couple of factors have led to AI becoming the next “big” thing: First, huge amounts of data are being created every minute. In fact, 90% of the world’s data has been generated in the past two years. And now thanks to advances in processing speeds, computers can actually make sense of all this information more quickly. Because of this, tech giants and venture capitalists have bought into AI and are infusing the market with cash and new applications.
When it comes to AI, a robot is nothing more than the shell concealing what’s actually used to power the technology.