Leaving Bloomberg and Going Back to API Evangelist

https://apievangelist.com/2024/08/19/leaving-bloomberg-and-going-back-to-api-evangelist/

"I will be taking my previous API Evangelist foundation, combining it with what I learned at F5 and Bloomberg (both enterprises), but also what I learned at Postman (a startup) as well as via my 125+ Breaking Changes podcast episodes, then applying it all as a new podcast and a set of API services focused on helping the enterprise govern API operations." -- #KinLane

#api360 #apiEvangelist #api

Leaving Bloomberg and Going Back to API Evangelist

Friday was my last day at Bloomberg. I learned what I had come there to learn and now it is time for me to get back to my API Evangelist work. Moving forward, I will be taking my previous API Evangelist foundation, combining it with what I learned at F5 and Bloomberg (both enterprises), but also what I learned at Postman (a startup) through my conversations with numerous enterprise customers, as well as via my 125+ Breaking Changes podcast episodes, then applying it all as a new podcast (kinda sorta) and a set of API services focused on helping the enterprise govern API operations.

API Evangelist

Expanding the Definition of Our API Contracts

https://apievangelist.com/2024/07/06/expanding-the-definition-of-our-api-contracts/

"As a potential consumer of an API, is it enough that I the API producer only publish an OpenAPI and share that with you? No, of course not" -- #KinLane

#api360 #apiCatalog #apisJSON

Expanding the Definition of Our API Contracts

In recent years we’ve begun collectively using the phrase “API contract”, often to describe the OpenAPI or AsyncAPI for our APIs. While I have been complicit in the adoption of this phrase, and support its usage anytime I can, I also feel that it reflects much of what is deficient with the API Economy. I agree that an OpenAPI represents a contract between producer and consumer, but I am painfully aware that this represents just the technical side of the contract, and much of the business side of things is taken for granted or not addressed at all.
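One way to picture the gap described above is to put the two halves of the contract side by side. A minimal sketch in Python follows; the OpenAPI fragment is genuine OpenAPI 3.1 structure, but the `x-business-terms` extension key and everything inside it are invented here purely to illustrate what the business half might need to state.

```python
# The technical half: a minimal (valid-shaped) OpenAPI 3.1 description.
technical_contract = {
    "openapi": "3.1.0",
    "info": {"title": "Example API", "version": "1.0.0"},
    "paths": {
        "/orders": {
            "get": {
                "summary": "List orders",
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
}

# The business half: terms an OpenAPI alone leaves unstated.
# These field names are hypothetical, not part of any standard.
business_terms = {
    "pricing": "free up to 1000 calls/day",
    "sla_uptime": "99.9%",
    "support_channel": "email",
    "deprecation_notice_days": 90,
}

# Combine both halves under a (hypothetical) vendor-extension key.
full_contract = dict(technical_contract, **{"x-business-terms": business_terms})
```

The point of the sketch is only that a "contract" worthy of the name would carry both dictionaries, not just the first.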

API Evangelist

The Diff Between What JSON Schema and Spectral Provide When Mapping the API Landscape

https://apievangelist.com/2024/05/07/the-diff-between-what-json-schema-and-spectral-provide-when-mapping-the-api-landscape/

"While JSON Schema is broadly applied across any “JSON schema”, Spectral is very focused on OpenAPI and AsyncAPI. I would like to leave all that baggage at the door, and zoom out to think just about APIs.json, and governing the landscape it maps using JSON Schema and Spectral." -- #KinLane #apiEvangelist

#api360 #jsonSchema #spectral

The Diff Between What JSON Schema and Spectral Provide When Mapping the API Landscape

I am working to better define and shape when and how I apply both JSON Schema and Spectral rules. Both specifications are invaluable when it comes to API governance, and have significant overlap, but there are many ways in which they differ when it comes to trying to define, shape, and ultimately govern the API landscape. I don’t pretend to have the answers here, but I know enough to know these are complex technological specifications seeking to tame a sprawling API landscape, and I enjoy writing about how I am seeing things work in hopes of making just a little more sense of it all.
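The difference being worked through above can be shown with a toy. This is an illustrative sketch only, not a real implementation of either specification: the first function hand-rolls a tiny subset of JSON Schema (`type` and `required`) to stand in for structural validation of any JSON, while the second mimics the flavour of a Spectral-style rule, which targets the API description document itself.

```python
# Toy structural validation, JSON Schema style: does this instance
# match this schema? (Supports only "type" and "required".)
def validate(instance, schema):
    """Return a list of error strings; empty list means valid."""
    errors = []
    expected = schema.get("type")
    type_map = {"object": dict, "array": list, "string": str,
                "number": (int, float), "boolean": bool}
    if expected and not isinstance(instance, type_map[expected]):
        errors.append(f"expected type {expected}")
    if expected == "object":
        for key in schema.get("required", []):
            if key not in instance:
                errors.append(f"missing required property {key!r}")
    return errors

# Toy governance linting, Spectral style: does this OpenAPI operation
# follow our house rules? (Rule invented here for illustration.)
def lint_operation(operation):
    return [] if operation.get("description") else ["operation missing description"]
```

JSON Schema answers "is this data shaped correctly?"; Spectral-style rules answer "is this API described the way our organisation says it should be?" — the toy makes the different targets of the two tools visible.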

API Evangelist

Zuckerman vs. Zuckerberg: why and how this is a battle over the public understanding of APIs, and why Zuckerman needs to lose and Meta needs to win

Imagine that you’re a cool, high-school, technocultural teenager; you’ve been raised reading Cory Doctorow’s “Little Brother” series, you have a 3D printer and a soldering iron, you hack on Arduino control systems for fun, and you really, really want a big strobe light in your bedroom to go with the music that you blast out when your parents are away.

So you build a stepper motor with a wheel and a couple of little arms, link it to a microphone circuit which does an FFT of ambient sound, and hot-glue the whole thing to your bedroom lightswitch so that the wheel’s arms can flick the lightswitch on-and-off in time to the beat.

If you’re lucky the whole thing will work for a minute or two and then the switch will break, because it wasn’t designed to be flicked on-and-off ten times per second; or maybe you’ll blow the lightbulb. If you’re very unlucky the entire switch and wiring will get really hot, arc, and set fire to the building. And if you share, distribute, and encourage your friends to do the same then you’re likely to be held liable in one of several ways if any of them suffer cost or harm.

Who am I?

My name’s Alec. I am a long-term blogger and an information, network and cyber security expert. From 1992-2009 I worked for Sun Microsystems, from 2013-16 I worked for Facebook, and today I am a full-time stay-at-home dad and part-time consultant. For more information please see my “about” page.

What does this have to do with APIs?

Before I begin I want to acknowledge the work of Kin Lane, The API Evangelist, who has been writing about the politics of APIs for many years. I will not claim that Kin and I share the same views on everything, but we appear to overlap perspectives on a bunch of topics and a lot of the discussion surrounding his work resonates with my perspectives. Go read his stuff, it’s illuminating.

So what is an API? My personal definition is broad but I would describe an API as any mechanism that offers a public or private contract to observe (query, read) or manipulate (set, create, update, delete) the state of a resource (device, file, or data).

In other words: a lightswitch. You can use it to turn the light on if it’s off, or off if it’s on, and maybe there’s a “dimmer” to set the brightness if the bulb is compatible; but lightswitches have their physical limitations and expected modes of use, and they need to be chosen or designed to fit the desired usage model and purpose.
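Sticking with the lightswitch analogy, the broad definition above can be sketched in a few lines. The class and method names here are mine, purely for illustration: any mechanism exposing operations to observe or manipulate the state of a resource fits the definition.

```python
class Lightswitch:
    """A resource whose state can be observed (read) or manipulated (set)."""

    def __init__(self):
        self._on = False
        self._brightness = 0

    # observe (query / read)
    def is_on(self):
        return self._on

    def brightness(self):
        return self._brightness

    # manipulate (set / update) -- with the limits and expected
    # usage model baked into the interface
    def set_on(self, on, brightness=100):
        self._on = on
        self._brightness = brightness if on else 0
```

The interface, not the implementation, is the "API": it defines what a consumer may observe and change, and implicitly how it is expected to be used.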

Perhaps to some this definition sounds a little too broad because it would literally include referring to (e.g.) “in-browser HTML widgets and ‘submit’ buttons for deleting friendships” as an “API”; but the history of computing is rife with human-interface elements being repurposed as application-interfaces, such as banking where it was once fashionable to link new systems to old backend mainframes by using software that pretends to be a traditional IBM 3270 terminal and then screen-scraping responses to queries which were “typed” into the terminal by the new system.

The modern equivalent for web-browsers is called Selenium WebDriver and is widely used by both automated software testers and criminal bot-farms, to name but two purposes.
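The screen-scraping pattern described above — recovering data from an interface built for human eyes — can be shown with a toy example using only Python's standard library. The HTML fragment and the `balance` class name are invented for illustration; real browser automation (e.g. via Selenium WebDriver) drives an actual browser, but the principle is the same as scraping a 3270 terminal: parse what a human was meant to read.

```python
from html.parser import HTMLParser

class BalanceScraper(HTMLParser):
    """Pull a value out of human-oriented HTML markup."""

    def __init__(self):
        super().__init__()
        self._in_balance = False
        self.balance = None

    def handle_starttag(self, tag, attrs):
        # Watch for the (hypothetical) cell that carries the figure.
        if tag == "td" and ("class", "balance") in attrs:
            self._in_balance = True

    def handle_data(self, data):
        if self._in_balance:
            self.balance = data.strip()
            self._in_balance = False

# Invented page fragment standing in for a real banking screen.
page = '<table><tr><td class="balance">1,234.56</td></tr></table>'
scraper = BalanceScraper()
scraper.feed(page)
```

The fragility is the point: the moment the platform renames that class or restructures the table, every scraper built on it silently breaks — which is precisely the "assumed contract" problem that Public APIs were invented to solve.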

So yes: the tech industry — or perhaps: the tech hacker/user community — has a long history of wiring programmable motors to lightswitches and hoping that their house does not catch on fire… but we should really aspire to do better than that… and that’s where we come to the history of EBay and Twitter.

History of Public APIs

In the early 2000s there was a proliferation of platforms that offered various services — “I can buy books over the internet? That’s amazing!” — and this was all before the concept of a “Public API” was invented.

People wanted to “add-value” or “auto-submit” or “retrieve data” from those platforms, or even to build “alternative clients”; so they examined the HTML, reverse-engineered the functions of Internal or Private APIs which made the platform work, wrote and shared ad-hoc tools that posted and scraped data, and published their work as hackerly acts of radical empowerment “on behalf of the users” … except for those tools which stole or misused your data.

Kin Lane particularly describes the launch of the Public APIs for EBay in November 2000 and for Twitter in September 2006; about the former he writes:

The eBay API was originally rolled out to only a select number of licensed eBay partners and developers. […] The eBay API was a response to the growing number of applications that were already relying on its site either legitimately or illegitimately. The API aimed to standardize how applications integrated with eBay, and make it easier for partners and developers to build a business around the eBay ecosystem.

link

…and regarding the latter:

On September 20, 2006 Twitter introduced the Twitter API to the world. Much like the release of the eBay API, Twitter’s API release was in response to the growing usage of Twitter by those scraping the site or creating rogue APIs.

link

…both of which hint at some issues:

  • an ecosystem of ad-hoc tools that attempt to blindly and retrospectively track EBay’s own platform development would not offer standardisation across the tools that use those APIs, and so would thereby actually limit potential for third-party client development; each tool would be working with different assumed “contracts” of behaviour that were never meant to be fixed or exposed to the public, and would also replicate work
  • proliferation of man-in-the-middle “services” that would act “on your behalf” — and with your credentials — on the Twitter and EBay platforms, presented both a massive trust and security risk to the user (fraudulent purchases? fake tweets? stolen credentials?) with consequent reputational risk to the platform
Why do Public APIs exist?

In short: to solve these problems. Kin Lane writes a great summary of the pros and cons of Public APIs and how they are used both to enable, but also to (possibly unfairly) limit, the power of third-party clients that offer extra value to a platform’s users.

But at the most fundamental level: Public APIs exist in order to formalise contracts of adequate means by which third parties can observe or manipulate “state” (e.g. user data, postings, friendships, …) on the platform.

By offering a Public API the platform also frees itself to develop and use Private APIs which can service other or new aspects of platform functionality, and it’s in a position to build and “ring-fence” the Public API service in the expectation of both heavy use and abuse being submitted through it.

Similarly: the Private APIs can be engineered more simply to act like domestic light-switches: to be used in limited ways and at human speeds; it turns out that this can be important for matters like privacy and safety.

Third parties benefit from Public APIs by having a guaranteed set of features to work with, proper documentation of API behaviour, confidence that the API will behave in a way that they can reason about, and an API lifecycle management process which will enable them to make their own guarantees regarding their work.

What is the Zuckerman lawsuit?

First, let me start with a few references:

The shortest summary of the lawsuit that I have heard from one of its ardent supporters is that the lawsuit:

[…] seeks immunity from [the Computer Fraud and Abuse Act] and [the Digital Millennium Copyright Act] [for legal] claims [against third parties or users] for automating a browser [to use Private APIs to obtain extra “value” from a website] and [the lawsuit also] does not seek state mandated APIs, or, indeed, any APIs

(private communication)

To make a strawman analogy so that we can defend its accuracy:

Let’s build and distribute motors to flick lightswitches on and off to make strobe lights, because what’s the worst that could happen? And we want people to have a fundamental right to do this, because Section 230 says we have such a right. We won’t be requiring any new switches to be installed, we just want to be allowed to use the ones that are already there, so it’s easy and low-cost to ask for, and there’s no risk to us doing this. But we also want legal immunity just in case what we provide happens to burn someone’s house down.

In other words: a return to the ways of the early 2000s, where scraping data and poking undocumented Private APIs was an accepted way to hack extra value into a website platform. To a particular mindset — especially the “big tech is irredeemably evil” folk — this sounds great, because clearly Meta intentionally prevents your having full, automated remote control over your user data on the grounds that it’s terribly valuable to them, and their having it keeps you addicted, so it helps them make money.

And you know what? To a very limited extent I agree with that premise — or at least that some of the Facebook user-interface is unnecessarily painful to use.

E.g. I feel there is little (some, but little) practical excuse for the heavy user friction which Facebook imposes upon editing of the “topics you may be interested in receiving adverts about”; but the way to address this is not to encourage proliferation of browser plugins (of dubious provenance regarding privacy and regulatory compliance, let alone uncertain behaviour) which manipulate undocumented Private APIs.

Apart from any other reason, as alluded to above, Private APIs are built in the expectation of being used in a particular way — e.g. by humans, at a particular cadence and frequency — and on advanced platforms like Facebook they are engineered with those expectations enforced by rate limits, not only for efficiency but also for availability, security and privacy reasons.

This is something which I partially described in a presentation on behalf of Facebook at PasswordCon in 2014, but the short version is: if an API is expected to be used primarily by a human being, then for security and trust purposes it makes sense to limit it to human rates of activity.

If you start driving these Private APIs at rates which are inhuman — 10s or 100s of actions per second — then you should and will expect them to either be rate-limited, or else possibly break the platform in much the same way that flicking a lightswitch at such a rate would break that lightswitch or bulb.
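The "human rates of activity" idea is commonly enforced with something like a token bucket. Here is a minimal sketch — the parameter values are my own, chosen only to make the human-vs-script contrast visible — of how a platform limits an endpoint to roughly human speed while tolerating a small burst:

```python
import time

class TokenBucket:
    """Allow ~`rate` actions/second with a burst allowance of `burst`."""

    def __init__(self, rate, burst):
        self.rate = float(rate)       # tokens added per second
        self.capacity = float(burst)  # maximum stored tokens
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self):
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# ~2 actions/second tolerates a human clicking a lightswitch; a script
# firing a tight loop of calls exhausts the burst and is refused.
bucket = TokenBucket(rate=2, burst=5)
results = [bucket.allow() for _ in range(100)]
```

A human never notices such a limit; an automated tool driving the same Private API at 100 calls per second hits it almost immediately — which is exactly the engineered expectation the paragraph describes.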

With this we can describe the error in one of the proponent’s claims: We aren’t requiring any new [APIs] to be installed, we just want to be allowed to use the ones that are already there — but if the Private API is neither intended nor capable of being driven at automated speeds then either something (the platform?) will break, or else there will be loud demands that the Private APIs be re-engineered to remove “bottlenecks” (rate limits) to the detriment of availability and security.

But if you will be calling for the formalisation of Private APIs to provide functionality, why are you not instead calling for an obligation upon the platform to provide a Public API?

Private APIs are not Public APIs, and Public APIs may demand registration

The general theme of the lawsuit is to demand that any API which a platform implements — even undocumented Private ones — should be legally treated as a Public API, open for use by third-party implementors, without any reciprocal obligation that the third-party client obtain an “API Key” to identify itself, or abide by particular behaviour or rate limits.

In short: the lawsuit demands that all APIs, both Public and Private, should become “fair game” to third-party implementors, and that the platforms should have no business distinguishing between one third party and another, even in the instance that one or more of them are malicious.

This is a dangerous proposal. Platforms innovate new functionality and change their Private API behaviour at a relatively rapid speed, and there is currently nothing to prevent that; but if a true “right to use” for a Private API becomes somehow enshrined, what happens next?

Obviously: any behaviour which interferes with a public right-to-use is illegal, so it will therefore become illegal to change or remove Private APIs — or at the very least any attempt to do so will lead to claims of “anticompetitive behaviour” and yet more punitive lawsuits. The free-speech rights of the platform will be abridged by compulsion to never change APIs, or to support legacy, publicly-used-yet-undocumented APIs forever more.

So, again, why not cut this Gordian knot by compelling platforms to make available a Public API that supports the desired functionality? After all, even Mastodon obligates developers of third-party apps to register their apps before use; but somehow big platforms should accept any and all non-human usage of Private APIs without discrimination?
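The registration model alluded to above — Mastodon-style app registration, an API Key per client — is what lets a platform tell a good-faith third-party tool apart from anonymous automation. A minimal sketch follows; the header name, keys, and client names are all invented here for illustration:

```python
# Hypothetical register of third-party clients that have identified
# themselves to the platform (key -> registered application name).
REGISTERED_CLIENTS = {
    "key-accessibility-tool": "Community accessibility plugin",
    "key-research-bot": "University research crawler",
}

def identify_client(headers):
    """Return the registered client name, or None for unidentified callers."""
    return REGISTERED_CLIENTS.get(headers.get("X-API-Key", ""))

def handle_request(headers):
    client = identify_client(headers)
    if client is None:
        # Unregistered automation gets refused, not silently served.
        return 401, "register your application to obtain an API key"
    return 200, f"hello, {client}"
```

With identification in place the platform can apply per-client policy — quotas, revocation, abuse investigation — which is exactly what becomes impossible if all callers of a Private API must be treated identically.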

Summary

I don’t want to keep flogging this horse, so I am just going to try and summarise in a few bullets:

  • Private APIs exist to provide functionality to directly support a platform; they are implemented in ways which reflect their expected (usually: human) modes of use, they are not publicly documented, they can come and go, and this is normal and okay
  • Public APIs exist to provide functionality to support third-party value-add to a platform; they are documented and offer some form of public “contract” or guarantee of behaviour, capability, and reliability. They are often designed in expectation of automated or bulk usage.
  • Private APIs do not offer such a public contract; they are not meant to be built upon other than by the platform itself. They are meant to be able to “go away” without fuss, but if their use becomes a guaranteed “right” then how can they ever be deprecated?
  • If third parties want to start using Private APIs as if they were Public APIs then the Private APIs will probably need to be re-engineered to support the weight of automated or bulk usage; but if they are going to be re-engineered anyway, why not push for them to become Public APIs?
  • If Private APIs are not re-engineered and their excessive automated use by third party tools breaks the platform, why should the tool-user or the tool-provider not be held at least partly responsible as would happen in any other form of intentional or unintentional Denial-of-Service attack?
  • If some (in-browser) third party tools claim to be acting “for the public good” then presumably they will have no problem in identifying themselves in order to differentiate themselves from (in-browser) evil cookie-stealing malware and worms; but to differentiate themselves would require use of an API Key and a Public API — so why are the third-party tool authors not calling to have the necessary Public APIs?
  • Just because an academic says “I wrote a script and I think it will work and that I [or one of your users] should be allowed to run it against your service without fear of reprisal even though [we] don’t understand how the back end system will scale with it”— does not mean that they should be permitted to do so willy-nilly, not against Facebook nor against your local community Mastodon instance.

https://alecmuffett.com/article/109757

#apis #ethanZuckerman #kinLane #meta #scraping


Dropsafe

Moving API Docs From Human-Readable to Machine-Readable https://buff.ly/3TPK2KV

"I have more faith that we will get humans to adopt or at least auto-generate OpenAPI and APIs.json than I believe we’ll develop AI to unwind this mess." -- #KinLane

#api360 #apiDocs #apisJSON

Moving API Docs From Human-Readable to Machine-Readable

One of the super powers of APIs.json is the ability to evolve the human-readable aspects of API operations into machine-readable ones–as this is how we are going to scale to deliver the API economy all of us API believers envision in our mind’s eye. I saw what Swagger (now OpenAPI) had done for API documentation back in 2013, and I wanted this for the other essential building blocks of our API operations. A decade later I am still translating our getting started, plans, SDKs, road map, change log, and support into machine-readable artifacts as part of our API Commons work, but I am still working to translate documentation into machine-readable artifacts as well.

API Evangelist

What is API Governance?

https://apievangelist.com/2024/02/22/what-is-api-governance/

"For me, API governance is equal parts about the artifacts and the process, but is all about the people landscape across the organization and industry where API governance is being applied." -- #KinLane

#api360 #apiPlatform #apiGovernance

What is API Governance?

One thing I like about this blog format is that I can use the title, “What is API Governance”, over and over. Each blog post has the date timestamped in the title, so it captures my thoughts on the subject over time. Using my blog to work through complex concepts over time is how I govern the velocity of my career, speeding up and slowing down as required, to help me find the signal in the noise. I am the API governance lead at Bloomberg. I have helped define and shape the world of API governance as Chief Evangelist at Postman, through my work on OpenAPI, AsyncAPI, JSON Schema, and Spectral, and as the API Evangelist. I should know what API governance is, but I know enough to know that this definition is fluid and changes over time. So I am perpetually looking to ask this question and continue to test my understanding along the way.

API Evangelist

Where Is This API Gateway Thing Going?

https://apievangelist.com/2023/11/12/where-is-this-api-gateway-thing-going/

"A modern API gateway will need to be the bridge between business and engineering with a native product centered experience that closes the divide that has existed for decades with the help of a platform team." -- #KinLane

#api360 #apiPlatform #apiEvangelist

Where Is This API Gateway Thing Going?

According to Gartner, the full lifecycle API management quadrant is going away, but the firm acknowledges that in 2023 the API gateway continues to enjoy an outsized amount of focus when it comes to internal and public API operations. The gateway is important, and tends to be where attention is focused in good times and bad, but other stops along the API life cycle are also critical to the conversation, and I’m looking to understand more about why, while trying to understand where things might be going.

API Evangelist

What Comes After Microservices?

https://apievangelist.com/2023/01/25/what-comes-after-microservices/

"So, instead of just voting for the demise of microservices and focusing on what comes next, let’s continue to do the work to standardize how we are working today—ensuring things are more interoperable by default."

#api360 #microservices #kinLane

What Comes After Microservices?

Good API trends can linger on for some time, but as with the decline of the monolith, many love to anticipate the death of microservices through strangulation by its own distributed weight. While event-driven, circuit-breakers, GraphQL, and other leading patterns will continue to help us orchestrate successfully across this chaos, there are some other elements of our API operations that will be shaping what the API landscape looks like in the coming years.

API Evangelist