1) A robot may not injure a human being or, through inaction, allow a human being to come to harm ... unless it makes a lot of money

2) A robot must obey the orders given it by human beings except where such orders would conflict with the First Law ... unless it makes a lot of money

3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law ... unless it makes a lot of money

@joshbressers this would make a great-selling t-shirt if it were a bit more succinct... maybe an AI could summarize it? 🤣 🤣 🤣