Wormwood - An Explicit Way to Test Absinthe GraphQL APIs

    We love GraphQL at Tinfoil! We use it extensively in our Elixir and Phoenix powered API scanner. We try to test our code with ExUnit whenever possible to help ensure a stable and smooth development cycle. Testing an Absinthe GraphQL API usually follows this pattern: set up a ConnCase, make the request, then validate the data the request returned. A lot like the following:
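A minimal sketch of that pattern, assuming a standard Phoenix ConnCase, a hypothetical /api endpoint, and a made-up “users” field:

    # Hypothetical ConnCase-based test; the module names, the /api route,
    # and the "users" field are illustrative only.
    defmodule MyAppWeb.UsersQueryTest do
      use MyAppWeb.ConnCase

      @query """
      query {
        users {
          id
          email
        }
      }
      """

      test "lists users", %{conn: conn} do
        conn = post(conn, "/api", %{"query" => @query})

        # Parse the JSON response and validate its shape.
        assert %{"data" => %{"users" => users}} = json_response(conn, 200)
        assert is_list(users)
      end
    end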

    This works well! But as a result we may accumulate a whole lot of boilerplate code for setting up these Plug.Conn structs and parsing their results, not to mention the module attributes and strings scattered around to hold our GraphQL queries inside the unit test module itself. We figured there’s probably a better way to write these sorts of tests, one that leverages the power of the Absinthe library itself rather than sending HTTP requests during a test run.

    After some experimentation we created Wormwood, a small, open-source Elixir library that assists with Absinthe GraphQL document testing in ExUnit. We can eliminate large chunks of Plug.Conn boilerplate, remove static strings of query code, explicitly scope a module to a single query document and schema, and call our GraphQL API like this instead:
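A sketch of the Wormwood flavor of that same test; the schema module and document path are placeholders, and we assume the helpers come in via “use Wormwood.GQLCase” (check the Wormwood README for the canonical setup):

    # Hypothetical schema module and query path; load_gql/2 and query_gql/1
    # are the Wormwood helpers discussed later in this post.
    defmodule MyApp.GetUsersTest do
      use ExUnit.Case, async: true
      use Wormwood.GQLCase

      load_gql MyApp.Schema, "test/gql/GetUsers.gql"

      test "GetUsers executes without errors" do
        {:ok, query_data} = query_gql(variables: %{}, context: %{})

        refute Map.get(query_data, :errors)
      end
    end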

    In the above snippet, Wormwood loads, parses, and validates the query document at compile time, then runs that loaded query against the specified schema using Absinthe itself. With this method of GraphQL testing, it’s explicit which query document we are testing and which schema we are executing it against. We also gain the ability to test the errors Absinthe can return at different phases of the pipeline, and even to control the pipeline itself! (More on that later.)


Using Wormwood in Your Own App

    We’ll break down each of the steps for utilizing Wormwood in your own testing setup. You can get (and contribute to!) Wormwood on our GitHub! Wormwood is also available on the Hex package repository. Some of the example code shown in this post is available in the repo itself.

    The first requirement for using Wormwood is to break out your queries into individual GQL files. While this may sound excessive, it offers a lot of power in terms of code coverage and project organization. You can read more about using individual GraphQL files with Webpack in the Apollo docs; it’s pretty simple to set up. Once you have all your documents broken out into files, we can select the ones we want to test and make accompanying ExUnit test modules for them.

Let’s say we have this GQL document:
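For illustration, a hypothetical GetUsers.gql that pulls a shared fragment in through an #import comment (the same convention the Apollo Webpack loader uses):

    #import "./UserAttributes.gql"

    query GetUsers {
      users {
        ...UserAttributes
        posts {
          id
          title
        }
      }
    }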

And our imported fragment is just a simple set of reusable attributes, it looks like this:
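Something like this, for the sake of the example:

    fragment UserAttributes on User {
      id
      name
      email
    }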

We can then create a basic ExUnit test module we’ll call “get_users_test.exs”, and we’ll load our schema and the GQL document into the module using the “load_gql/2” macro.
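A skeleton of that module might look like the following (the schema module and file path are placeholders; we assume the macros are pulled in with “use Wormwood.GQLCase”):

    defmodule MyApp.GetUsersTest do
      use ExUnit.Case, async: true
      use Wormwood.GQLCase

      # Attach the schema and the parsed, import-expanded document to this module.
      load_gql MyApp.Schema, "test/gql/GetUsers.gql"
    end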

Let’s break down what Wormwood is doing here:

    When “load_gql/2” is executed, it attaches two special attributes to the module it was called from: one holds the Absinthe schema from the first argument, the other holds the full source text of the query file from the second argument. Wormwood expands every import statement it can find, and raises an exception if it cannot find a file or if Absinthe cannot validate the syntax of the full query.

Now that our module has a schema and document assigned, we can query it using the simple “query_gql/1” function:
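For example, a test along these lines, inside the module defined above (the assertions are just one way to check for a clean run against our hypothetical schema):

    test "GetUsers returns data and no errors" do
      {:ok, query_data} = query_gql()

      refute Map.get(query_data, :errors)
      assert is_list(get_in(query_data, [:data, "users"]))
    end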

Our query results now live in the “query_data” variable. If we inspect it, we can see that our results are returned in vanilla Elixir lists and maps:
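With the hypothetical schema above, the inspected value might look something like this (the users themselves are made up for illustration):

    %{
      data: %{
        "users" => [
          %{"id" => "1", "name" => "Ada Lovelace", "email" => "ada@example.com", "posts" => []},
          %{"id" => "2", "name" => "Alan Turing", "email" => "alan@example.com", "posts" => []}
        ]
      }
    }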

    If you have a deeply nested structure, a good tip is to use the Elixir Kernel function “get_in/2”, which takes the structure you want to access data from and a list of keys or access functions to retrieve specific members. For example, if I wanted to fetch the id of the first user in this big query result, I could simply do the following:
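Continuing the made-up result above:

    first_user_id = get_in(query_data, [:data, "users", Access.at(0), "id"])
    # => "1"

Access.at/1 is the standard Elixir access function for indexing into a list, which is why it can sit alongside plain keys in the path.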


More Advanced Queries


    The above example is just a simple demo of how to use Wormwood in your testing suite. Wormwood supports a few more options and configurations when testing. Below is a quick list of the features, along with snippets to show how it’s done. You can also dig around in the examples folder on the GitHub repo.

Running a Query With Variables and Context


   You can pass the same options keyword list you would pass to “Absinthe.run/3” into the “query_gql/1” function. Refer to the Absinthe docs for the exact options and their usage. If we want to pass a variable into this query, we can just leverage the options Absinthe provides:
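For example, assuming the loaded document declares a $limit variable (our GetUsers example above does not, so treat this as a sketch inside the same GQLCase module):

    # Assumes e.g. query GetUsers($limit: Int) { users(limit: $limit) { ...UserAttributes } }
    test "GetUsers honors the limit variable" do
      {:ok, query_data} = query_gql(variables: %{"limit" => 1})

      assert length(get_in(query_data, [:data, "users"])) <= 1
    end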

The same can be done for context if you want to fake something like authentication:
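The shape of the context map is whatever your resolvers expect; a :current_user key is a common convention, but it is only an assumption here:

    test "GetUsers as an authenticated user" do
      # Hypothetical context shape; your resolvers define what they read from it.
      {:ok, query_data} = query_gql(context: %{current_user: %{id: "1", role: :admin}})

      refute Map.get(query_data, :errors)
    end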


Running a Raw String, Rather Than a GQL File


    Of course, if you don’t want to break your GQL documents out into files, you can still assign them to a test module as a raw string. Rather than calling “load_gql/2”, call “set_gql/2”:
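For instance, against the same hypothetical schema as before:

    defmodule MyApp.InlineQueryTest do
      use ExUnit.Case, async: true
      use Wormwood.GQLCase

      set_gql MyApp.Schema, """
      query GetUsers {
        users {
          id
          name
        }
      }
      """
    end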

Wormwood will still expand import statements when using raw strings! Import paths will be relative to the current working directory, which is usually your app’s root directory.


Running a Query With a Custom Absinthe Pipeline


    If at any point you wish to modify the pipeline that Absinthe uses for executing a document loaded into a module, you can do so by composing a list of pipeline Phases, and passing them into the “query_gql_with_pipeline/2” function like so:
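A sketch of what that can look like, using Absinthe’s own pipeline helpers to build the phase list; the argument order of “query_gql_with_pipeline/2” shown here is an assumption, so double-check it against the Wormwood docs:

    test "GetUsers with a trimmed pipeline" do
      # Start from Absinthe's default document pipeline for our schema and
      # drop one validation phase as an example modification.
      pipeline =
        MyApp.Schema
        |> Absinthe.Pipeline.for_document(variables: %{})
        |> Absinthe.Pipeline.without(Absinthe.Phase.Document.Validation.ScalarLeafs)

      {:ok, query_data} = query_gql_with_pipeline([], pipeline)

      refute Map.get(query_data, :errors)
    end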

  

   Wormwood was born out of specific quirks we ran into while testing our API Scanner, and we hope you find it useful as well! It aims to accelerate and improve the way GraphQL tests are written within ExUnit. If you have stars, issues, or contributions, feel free to leave any of them on the official GitHub repo!


Aaron Shea

Aaron is our Resident Robot, and fittingly he's crazy about automation. He's always investigating new ways to improve a developer's workflow. When he's not forging code tools, he's probably researching graphics and animation. Aaron studied Computer Science at the University of Hartford.

Tags: graphql


Announcing GraphQL Security Scanning

For the second time this year: API security scanning changes today. We’ve been working hard on adding support to scan GraphQL APIs for security vulnerabilities, best practices, and correctness. Earlier this year, the Tinfoil Security API Scanner initially launched with support for the Swagger documentation format, and we’re excited to expand coverage to now include GraphQL APIs. To be clear, we’re not deprecating support for OpenAPI scanning - in fact, OpenAPI specification v3 support is coming soon! We’ve enjoyed building our own GraphQL APIs to power our user interfaces and we felt the need to ensure their correctness as we built them. To that end we’ve added first-class GraphQL support to our API Scanner. We’re thrilled today to announce the beta of our GraphQL scanning capabilities at the GraphQL Summit conference in San Francisco.

We use GraphQL internally to iterate quickly on our user interfaces without huge changes to the backend server each time. (By the way, we’re hiring, if Elixir, GraphQL, and Vue interest you.) GraphQL makes it easy to decouple user interface needs from a backend API server by offering a buffet of data and relationships without restricting the format to a specific JSON payload. UI developers can now iterate quickly, but this puts extra load on API server engineers to build a performant and, most importantly, safe GraphQL API.

One huge advantage of GraphQL APIs is that they are self-documenting. Most GraphQL APIs can be introspected to pull out the types, fields, and mutations. This can make it a joy to work with a tool like GraphiQL to explore an API, but also makes it very easy to get started scanning. All you need to do is provide the GraphQL endpoint and the Tinfoil Security API scanner will do the rest. We automatically discover the different types, fields, arguments, and mutations exposed by your API, and generate an optimized set of documents to exercise all of the different aspects of your API.

In addition to searching for various injections (both direct and blind), we also look for GraphQL-specific concerns. One such concern is cycles in the query graph, which can DoS an API server with a request that is time-consuming to fulfill. GraphQL servers allow you to set complexity limits on documents received from the client to help prevent this, and our scanner makes sure your API server has a reasonable complexity limit set. When not auditing high-complexity queries, we make sure the documents we generate balance simplicity with API coverage.
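As a concrete illustration of that server-side control, here is roughly what a complexity limit looks like in Absinthe (the library used elsewhere on this blog); the field, cost function, and ceiling are made-up examples, not a prescription:

    # Hypothetical schema fragment: the cost of the users field scales with
    # the requested page size and the cost of each child field.
    field :users, list_of(:user) do
      arg :limit, :integer, default_value: 10
      complexity fn %{limit: limit}, child_complexity -> limit * child_complexity end
      resolve &MyApp.Resolvers.list_users/3
    end

    # Enforce a ceiling when executing documents (also available as options
    # to Absinthe.Plug).
    Absinthe.run(document, MyApp.Schema,
      analyze_complexity: true,
      max_complexity: 200
    )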

Our support for GraphQL is only beginning; please stay tuned for more developments! If you’re interested in joining the beta of our GraphQL Scanning, please drop us a line.


Ben Sedat

Ben Sedat is the Engineering Wizard of Tinfoil Security. He's a bit of a blend between a traditional software engineer (builder) and security engineer (breaker). He spends a lot of time thinking about security: both detection as well as creating solutions for the security issues that exist in software and the internet. He also plays lots of video games. Lots.

Tags: graphql


TechNet Cyber 2019: Here we come!

Ready for another round of “Do Good With Your Data?” Join us at TechNet Cyber in Baltimore this weekend! In the vein of our community partner for the year, HSSV, we’ve deviated a bit to help the veterans that the folks at TechNet support.

We believe your data should be used for something good, and want to encourage you to think about what your data is worth.

To help us with our mission, Tinfoil Security has partnered with Freedom Service Dogs of America for TechNet 2019. FSDA provides assistance dogs for veterans returning from Iraq and Afghanistan. Each dog provides support as veterans re-enter civilian life, including companionship for disabled veterans. For every badge we scan, we will make a donation to FSDA. The more scans we get, the more money goes toward this incredible foundation, and the greater the impact we can make for those who gave so much to protect us. Help us help veterans.


Additionally, we’re excited to announce that our founders have both won the AFCEA 40 under 40 award this year! We are humbled and honored to be selected. We have a booth at TechNet Cyber in Baltimore this year, and would love to chat. Feel free to stop by booth 2143 for a conversation and a tinfoil hat. 


TechNet Cyber - a conference connecting government and military leaders with industry - gives us the opportunity to further educate business and government leaders about the necessity of a security-first mindset. We believe that the internet and its users need more security, and we strive to make that goal a reality, one company at a time. If even a few companies or government agencies see the advantages of securing web applications and APIs in their DevOps pipeline, then our mission at TechNet Cyber will have been a success.

Most post-breach plans by IT leaders have a common flaw: the plans are reactive rather than proactive. Security should be the first thing you develop a mindset for. A good guide to start with is the SANS Top 5, especially if a security team is a new addition to your organization.

A large number of breaches are the direct result of something easily preventable. That is the most frustrating aspect of most breaches: something easily preventable causing damage to consumers and companies alike. The rate of human error is too high for the demanding nature of today’s SDLC; that’s where automated tools come in. A solid security team augmented with a great web application and API scanner works to ensure that you are doing everything you can to be secure.

We are excited to attend TechNet Cyber 2019. We’ll be at booth 2143, so make sure you stop by and have a conversation about how our web application and API scanners can help your company be secure and efficient. If nothing else, stop by the booth to see if you can do more push-ups than Derek. ;)


Nicholas Bates

Nicholas is The Writing Writer, Tinfoil Security's new Technical Content Writer. He has a background in network security, where he worked as an engineer for 7 years before joining the Tinfoil team. The father of a 3-year-old daughter, when he is not writing you can find him hiking with his family or practicing Jiu-Jitsu.


RSA Conference 2019: Questions, Comments, Concerns?

What Just Happened?!

A little over a month ago, the Tinfoil Security team was out in full force at RSA. We met and talked to over a thousand of you. We were ecstatic about these conversations, as what we have been working on fell in line with what developers, CISOs, and security engineers see as relevant and essential. The fact that security keeps coming to the forefront is good for everyone. The more security awareness there is, the higher the expectation for security will rise, and companies with poor security practices will lose. Consumer information will be protected by better and continually improving security practices and processes.

The Takeaways

There were a few things that we believe stood out the most at the conference. First, GDPR: what is going on with it, how do we comply, and do we need to comply? These were a few of the questions floating around regarding GDPR, which has spawned a whole new sub-industry. At this year’s RSA there were many consulting, auditing, and assessment companies playing explicitly in the GDPR “field.” We gained some insight into GDPR as it applies to companies moving forward; in a general sense, there is a lack of a clear plan or outline for GDPR compliance. California has come up with its own new privacy law that has companies concerned and confused as well. The sentiment I came away with is that there will be a rather large learning curve when it comes to tech companies and GDPR compliance.

The second theme was the adoption of security for DevOps, or a security-first mindset when developing applications. This idea was pushed further into the light by our very own CEO, Ainsley, with her talk at RSA (“Building Security Processes for Developers and DevOps Teams”), which highlighted how the world of technology and development has changed. Briefly: the pace at which we push code is impossible to keep up with, security-wise, using old processes, technologies, and teams. Tackling this monumental task means building a multidisciplinary team to handle security and development for your applications. Her talk also focused on how to bridge the communication gap between operations, security, and development, breaking down the walls that currently exist.

The last hot topic was IoT and securing IoT devices. Few companies know how to build secure internet-connected devices. As the adoption of IoT devices ramps up, consumers are left less secure and with a wider attack surface. For the time being, consumers should be skeptical of IoT devices and take into consideration the increased level of vulnerability they are opening themselves up to. Any company that wants to play in the IoT space should make a few fundamental considerations. At a minimum, companies should find where their threats lie and build processes to reduce those threats. Releasing information to consumers about what they’re working on fixing and what the consumer is still vulnerable to helps as well. There are currently no laws we’re aware of that would force companies to disclose that information to consumers, but companies like Synology, 1Password, and Microsoft do an outstanding job of keeping their customers up to date about what they're working on and what they have remediated.

In Summary

This year’s RSA Conference centered around a few main points that were reiterated throughout the entire event. GDPR is here to stay, so how do companies cope with it? Security for DevOps was a major concern for attendees as well, with questions focused on how to get teams to work together, whether they are focused on development or security, and how to automate some of the processes involved in developing secure applications. We can help with that last point. Lastly, the overall impact of IoT was ever-present, with everyone asking where security in the IoT space is going and how we believe it needs to get there. Did you find these same themes echoed at RSA? Are there things we should know or ideas we missed? Let me know!



Nicholas Bates

Nicholas is The Writing Writer, Tinfoil Security's new Technical Content Writer. He has a background in network security, where he worked as an engineer for 7 years before joining the Tinfoil team. The father of a 3-year-old daughter, when he is not writing you can find him hiking with his family or practicing Jiu-Jitsu.

Tags: security DevSec DevOps


You've Got the Swagger, We've Got the SaaS

API security scanning changes today. Tinfoil Security is launching our new patent-pending API Scanner! After rigorous testing and astounding feedback from developers leveraging our scanner, we are proud to offer it to the public. We are giving developers and companies the ability to scan and secure APIs with two different methods of deployment: on-premise and SaaS. We are very excited to invite you to see the API Scanner in action.

As a clarification, we’re talking about web-based APIs such as REST APIs, web services, mobile-backend APIs, and the APIs that power IoT devices. We are not targeting lower-level APIs like libraries or application binary interfaces - a crucial distinction to make. The sorts of security vulnerabilities that affect web-based APIs are going to mirror the same categories of vulnerabilities we’ve spent the past eight years defending against with our web application scanner.

We’ve built a security scanner that understands how APIs work from the ground up, how they’re used, and most importantly, how they’re attacked. Existing solutions use a web application scanner to scan APIs, but that approach does not take into account the unique nature of APIs. Just as web applications can be vulnerable to issues like Cross-Site Scripting (XSS) or SQL injection, APIs can also fall prey to similar attacks. It isn’t quite that simple, though, and the nuances of how these vulnerabilities are detected and exploited can vary drastically between the two types of applications. In the case of XSS, for example, the difference between a vulnerable API and a secure API depends not only on the presence of attacker-controlled sinks in an HTTP response, but also on the content-types of the responses in question, how a client consumes those responses, and whether sufficient content-type sniffing mitigations have been enforced.

APIs also aren’t discoverable. Unless you’re one of the dozen companies in the world with a HATEOAS-based API, it isn’t possible for a security scanner to load up your API, follow all of the links, and automatically discover all of the endpoints in that API. The parameters expected by those endpoints, and any constraints required by them, are even harder to discover. Without some way of programmatically acquiring this information, API security scanning simply can’t be automated in the same way that web scanning has been.

To deal with these discoverability issues in APIs, we looked at standards like Swagger, RAML, and API Blueprint. We’ve found that Swagger (now known as the OpenAPI Specification), in particular, is winning out as the standard for API documentation. In response, we’ve designed our API scanner to ingest Swagger documents and use them to build a map of an API for scanning. Doing so solves the issue of being unable to crawl an API. It also allows us to scan APIs with a higher level of intelligence than black-box dynamic web application scanning has ever been able to.

We have addressed authentication issues using something we call authenticators. We’ve distilled API authentication down to its foundations, whether that’s as simple as adding a header or parameter to a request, or as complex as performing an entire OAuth2 handshake and storing the received bearer token for later use. We’re then able to chain together all of the authenticators, incrementally transforming unauthenticated requests into authenticated ones. Having such a nuanced understanding of all the steps of an authentication workflow lets us detect when any of those steps have failed, and when the server isn't honoring any of them. This allows us to fuzz the individual steps of an authentication flow, providing a powerful tool for determining authorization and authentication bypasses.

Some features that will get any developer excited: 

  • Intelligent payload generation
  • A powerful REST API to control the scanner and its reports
  • First-class support for API authentication workflows

What we mean by intelligent payload generation is parameter fuzzing that takes into account the schemas of the parameters: the types of the parameters, their constraints, whether a parameter is required, and valid inhabitants, to name a few.

In addition to vulnerability scanning, our scanner also performs correctness checking and looks for bugs to reduce an API’s attack surface. With full integration into your existing CI/CD pipeline, we create and track issues and vulnerabilities easily, allowing your teams to focus on what’s most important to your organization.

You can, for the first time, secure your APIs with a scanner that was built specifically for APIs. We’re not just pointing our web application scanner at your API and calling it good. Our API scanning is intelligent and thorough. By using your API’s specification as an outline, we focus on security vulnerabilities as they manifest themselves within APIs. This means fewer false positives, a higher degree of coverage, and a better understanding of the risk posture posed by your APIs.

There is every reason to give our unrivaled API Scanner a test drive, and you can do it right now. For a more detailed feature breakdown, please refer to this post.



Nicholas Bates

Nicholas is The Writing Writer, Tinfoil Security's new Technical Content Writer. He has a background in network security, where he worked as an engineer for 7 years before joining the Tinfoil team. The father of a 3-year-old daughter, when he is not writing you can find him hiking with his family or practicing Jiu-Jitsu.

Tags: DevOps DevSec Free Scan Launch