Shifting to an Integrated Security Approach
November 22, 2019
At Tinfoil, we’re going to spend this month taking time to care about one of the most important things - you! That’s right, this month is our month of empowering customers so they’re successful. We’ll be writing some blog posts and posting some videos to help make your security for DevOps implementation easier. If you have any questions you want answered about anything security or SecDevOps related (or kitten related, because we like those, too!), email email@example.com.
Our first post is a question we see a lot:
I get a lot of push back from developers on having to worry about security. They produce a lot of our security bugs. How can I fix this?
This is a great question! Security isn't easy; otherwise, everybody would be secure. To implement security across your development teams, you need to identify the root of the problem. Do your developers feel held back from building? Does it feel like there's so much added process that it blocks development? Is there a barrier of intimidation to be tackled before you can really empower your developers? We've seen each of these cases, but understanding what's happening in your organization makes all the difference.
The most common objection we hear from developers stems from one fact: security is hard… really hard! Oftentimes, this difficulty intimidates engineers. Perhaps there's a fear for job security, or maybe the tools you're providing your developers are too difficult to understand. Fear is most easily assuaged by good education. Turn security into a fun competition and you'll often see developers come out of their shells, challenging themselves and you. See how your current security tools can integrate into the development process, be simplified, and be more easily understood. Do your tools provide a 97-page PDF? That's probably not going to work for your development teams.
Sometimes, developers will push back on implementing security because it takes too much time. If the new process you've given your development team is blocking your engineers from creating something new, it's best to add security as a process without making your developers ever _feel_ like there's a new process. Hooking into your CI system so vulnerabilities are found with the existing testing process lets security get tested alongside your developers' existing unit, regression, and integration tests. Inserting a vulnerability into their issue tracker as a normal bug allows your developers to start looking at it in the same place as all of their other features and bugs. Bonus points if your security tools provide remediation instructions for each vulnerability! Security will naturally become less intimidating, and your developers will learn to consistently produce more secure code.
The least common objection we see is complacency: "security doesn't matter to me." This is mostly an educational gap that needs to be worked through. In this rare case, understanding your team dynamic and what drives each team lead and member is crucial. Sometimes, gamifying security piques the team's interest. Sometimes, the drive to learn, paired with classes on how the team's past vulnerabilities were introduced, can help. Bonus and reward structures might help a more stubborn team.
Shifting from a traditional cybersecurity approach to an integrated security approach is tough. It takes management buy-in and a lot of hard work, but it pays off in the long run. You know your team better than we do, and each team can be motivated in many different ways; we're happy to help. Let us know how we can!
This post is from our month of empowering customers for success series. Have a question you want answered about anything security or SecDevOps related (or kitten related, because we like those, too!)? Email firstname.lastname@example.org.
Wormwood - An Explicit Way to Test Absinthe GraphQL APIs
October 23, 2019
We love GraphQL at Tinfoil! We use it extensively in our Elixir and Phoenix powered API scanner. We try to test our code using ExUnit whenever possible to help ensure a stable and smooth development cycle. Testing an Absinthe GraphQL API usually follows the pattern of setting up a ConnCase, making the request, and then validating that the data returned from the request was valid. A lot like the following:
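A minimal sketch of that pattern might look like this (the module name, endpoint path, and fields are all hypothetical):

```elixir
defmodule MyAppWeb.UserQueryTest do
  use MyAppWeb.ConnCase, async: true

  @query """
  query GetUsers {
    users {
      id
      name
    }
  }
  """

  test "query: users", %{conn: conn} do
    # Build the request, POST it to the GraphQL endpoint, then
    # decode the JSON response and validate the returned data.
    conn = post(conn, "/api/graphql", %{"query" => @query})
    assert %{"data" => %{"users" => users}} = json_response(conn, 200)
    assert is_list(users)
  end
end
```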
This works well! But as a result we may accumulate a whole lot of boilerplate code for setting up these Plug.Conn structs and parsing their results. Not to mention the frequent module attributes and strings used to contain our GraphQL queries inside the unit test module itself. We figured there's probably a better way to write these sorts of tests, one that can leverage the power of the Absinthe library itself rather than sending HTTP requests during a test run.
After some experimentation we created Wormwood, a small and open source Elixir library to assist with Absinthe GraphQL document testing in ExUnit. We can eliminate large chunks of Plug.Conn boilerplate, remove static strings of query code, explicitly scope a module to a single query document and schema, and call our GraphQL API like this instead:
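A sketch of what such a test can look like, with hypothetical module and file names:

```elixir
defmodule MyApp.GetUsersTest do
  use ExUnit.Case, async: true
  use Wormwood.GQLCase

  # Scope this test module to one schema and one query document.
  load_gql MyApp.Schema, "assets/queries/GetUsers.gql"

  test "query: GetUsers" do
    # Runs the loaded document directly through Absinthe; no HTTP involved.
    result = query_gql(variables: %{}, context: %{})
    assert {:ok, _query_data} = result
  end
end
```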
In the above snippet, Wormwood is loading, parsing, and validating the query document at compile time; then it runs that loaded query against the specified schema using Absinthe itself. With this method of GraphQL testing, it's explicit which query document we are testing and which schema we are executing it against. We also gain the ability to test against the errors that Absinthe can return at the different phases of the pipeline, and even to control the pipeline itself! (More on that later.)
Using Wormwood in Your Own App
We’ll break down each of the steps for utilizing Wormwood in your own testing setup. You can get (and contribute to!) Wormwood on our GitHub! Wormwood is also available on the Hex package repository. Some of the example code shown in this post is available in the repo itself.
The first requirement for using Wormwood is to break out your queries into individual GQL files. While this may sound excessive, it offers a lot of power in terms of code coverage, and project organization. You can read more about using individual GraphQL files with Webpack in the Apollo docs, it’s pretty simple to set up. Once you have all your documents broken out into files, we can select the ones we want to test, and make accompanying ExUnit test modules for them.
Let’s say we have this GQL document:
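For example (the field names here are illustrative):

```graphql
#import "./UserAttributes.gql"

query GetUsers {
  users {
    ...UserAttributes
  }
}
```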
And our imported fragment is just a simple set of reusable attributes, it looks like this:
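Something along these lines, again with illustrative fields:

```graphql
fragment UserAttributes on User {
  id
  name
  email
}
```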
We can then create a basic ExUnit test module we’ll call “get_users_test.exs”, and we’ll load our schema and the GQL document into the module using the “load_gql/2” macro.
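Assuming a schema module named MyApp.Schema and the document from above, the test module skeleton might look like this:

```elixir
defmodule MyApp.GetUsersTest do
  use ExUnit.Case, async: true
  use Wormwood.GQLCase

  # Attaches the schema and the parsed, validated query document
  # to this module as special attributes.
  load_gql MyApp.Schema, "assets/queries/GetUsers.gql"
end
```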
Let’s break down what Wormwood is doing here:
When "load_gql/2" is executed, it attaches two special attributes to the module it was called from: one is assigned the Absinthe schema from the first argument, and the other the full source text of the query file from the second argument. Wormwood will expand all import statements it can find, and will raise an exception if it cannot find a file or if it cannot validate the syntax of the full query with Absinthe.
Now that our module has a schema and document assigned, we can query it using the simple “query_gql/1” function:
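For instance, a test along these lines (the expected data shape is hypothetical):

```elixir
test "query: GetUsers returns a list of users" do
  # Runs the document loaded by load_gql/2 against the attached schema.
  result = query_gql(variables: %{}, context: %{})
  assert {:ok, query_data} = result
end
```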
Our query results now live in the “query_data” variable. If we inspect it, we can see that our results are returned in vanilla Elixir lists and maps:
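The inspected result might look something like this, with made-up data:

```elixir
%{
  data: %{
    "users" => [
      %{"id" => "1", "name" => "Ada", "email" => "ada@example.com"},
      %{"id" => "2", "name" => "Grace", "email" => "grace@example.com"}
    ]
  }
}
```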
If you have a deeply nested structure, a good tip is to use the Elixir Kernel function "get_in/2", which takes the structure you want to access data from and a list of keys or access functions to retrieve specific members. For example, if I wanted to fetch the ID of the first user in this big query result, I could simply do the following:
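Assuming the result shape sketched above:

```elixir
# Access.at/1 lets get_in/2 index into the list of users.
first_user_id = get_in(query_data, [:data, "users", Access.at(0), "id"])
```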
More Advanced Queries
The above example is just a simple demo of how to use Wormwood in your testing suite. Wormwood supports a few more options and configurations when testing. Below is a quick list of the features, along with snippets to show how it’s done. You can also dig around in the examples folder on the GitHub repo.
Running a Query With Variables and Context
You can pass the same options keyword list you would pass to “Absinthe.run/3” into the “query_gql/1” function. Refer to the Absinthe docs on the exact options, and their usage. If we wanted to pass a variable into this query we can just leverage the options Absinthe provides:
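For instance, assuming a document that declares an $id variable and a matching user field:

```elixir
test "query: GetUser by ID" do
  # Variables are passed through to Absinthe.run/3 untouched.
  result = query_gql(variables: %{"id" => "123"}, context: %{})
  assert {:ok, %{data: %{"user" => user}}} = result
  assert user["id"] == "123"
end
```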
The same can be done for context if you want to fake something like authentication:
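A sketch, assuming your resolvers read a current_user key from the context:

```elixir
test "query: GetUsers as an authenticated admin" do
  # Fake the session your auth plug would normally build.
  fake_user = %{id: 1, role: :admin}
  result = query_gql(variables: %{}, context: %{current_user: fake_user})
  assert {:ok, %{data: _data}} = result
end
```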
Running a Raw String, Rather Than a GQL File
Of course if you don’t want to break out your GQL documents into files, you can still assign them to a test module as a raw string. Rather than calling “load_gql/2”, call “set_gql/2”:
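With the same hypothetical schema as before, that looks like:

```elixir
set_gql MyApp.Schema, """
query GetUsers {
  users {
    id
    name
  }
}
"""
```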
Wormwood will still expand import statements when using raw strings! They will be relative to the current working directory, which is usually your app root directory.
Running a Query With a Custom Absinthe Pipeline
If at any point you wish to modify the pipeline that Absinthe uses for executing a document loaded into a module, you can do so by composing a list of pipeline Phases, and passing them into the “query_gql_with_pipeline/2” function like so:
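As one sketch of this, Absinthe's own pipeline helpers can build the standard document pipeline and truncate it before execution; the exact result shape depends on which phases run:

```elixir
test "document passes validation without being executed" do
  # Build Absinthe's standard document pipeline, then cut it off
  # just before the resolution/execution phase.
  pipeline =
    MyApp.Schema
    |> Absinthe.Pipeline.for_document([])
    |> Absinthe.Pipeline.before(Absinthe.Phase.Document.Execution.Resolution)

  result = query_gql_with_pipeline(pipeline, [])
  assert {:ok, _blueprint} = result
end
```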
Wormwood was born out of specific quirks we ran into while testing our API Scanner, we hope you find it useful as well! It aims to help accelerate and improve the way GraphQL tests are written within ExUnit. If you have stars, issues or contributions, feel free to leave any of them on the official GitHub repo!
Announcing GraphQL Security Scanning
October 15, 2019
For the second time this year: API security scanning changes today. We’ve been working hard on adding support to scan GraphQL APIs for security vulnerabilities, best practices, and correctness. Earlier this year, the Tinfoil Security API Scanner initially launched with support for the Swagger documentation format, and we’re excited to expand coverage to now include GraphQL APIs. To be clear, we’re not deprecating support for OpenAPI scanning - in fact, OpenAPI specification v3 support is coming soon! We’ve enjoyed building our own GraphQL APIs to power our user interfaces and we felt the need to ensure their correctness as we built them. To that end we’ve added first-class GraphQL support to our API Scanner. We’re thrilled today to announce the beta of our GraphQL scanning capabilities at the GraphQL Summit conference in San Francisco.
We use GraphQL internally to iterate quickly on our user interfaces without huge changes to the backend server each time. (By the way, we're hiring, if Elixir, GraphQL, and Vue are interesting to you.) GraphQL makes it easy to decouple user interface needs from a backend API server by offering a buffet of data and relationships without restricting the format to a specific JSON payload. Nowadays, UI developers can iterate quickly, but this puts extra load on API server engineers to make a performant, and most importantly safe, GraphQL API.
One huge advantage of GraphQL APIs is that they are self-documenting. Most GraphQL APIs can be introspected to pull out the types, fields, and mutations. This can make it a joy to work with a tool like GraphiQL to explore an API, but also makes it very easy to get started scanning. All you need to do is provide the GraphQL endpoint and the Tinfoil Security API scanner will do the rest. We automatically discover the different types, fields, arguments, and mutations exposed by your API, and generate an optimized set of documents to exercise all of the different aspects of your API.
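For instance, a trimmed-down introspection document like this one is enough to enumerate a schema's types, fields, and arguments:

```graphql
query IntrospectSchema {
  __schema {
    queryType { name }
    mutationType { name }
    types {
      name
      fields {
        name
        args { name }
      }
    }
  }
}
```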
In addition to searching for various injections (both direct and blind), we also look for GraphQL-specific concerns. One such concern is cycles in the query graph, which can let an attacker DoS an API server with a request that is time-consuming to fulfill. GraphQL allows you to set complexity limits on documents received from the client to help prevent this, and our scanner makes sure your API server has a reasonable complexity limit set. When not auditing high-complexity queries, we make sure the documents we generate balance simplicity with API coverage.
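In Absinthe, for example, complexity analysis can be switched on where the endpoint is mounted, and individual fields can report their own cost; the schema details below are illustrative:

```elixir
# In your Phoenix router: reject documents above a complexity budget.
forward "/api/graphql", Absinthe.Plug,
  schema: MyApp.Schema,
  analyze_complexity: true,
  max_complexity: 200

# In the schema: a list field whose cost scales with the requested limit.
field :posts, list_of(:post) do
  arg :limit, :integer, default_value: 10

  complexity fn %{limit: limit}, child_complexity ->
    limit * child_complexity
  end
end
```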
Our support for GraphQL is only beginning; please stay tuned for more developments! If you’re interested in joining the beta of our GraphQL Scanning, please drop us a line.
TechNet Cyber 2019: Here we come!
May 14, 2019
Ready for another round of “Do Good With Your Data?” Join us at TechNet Cyber in Baltimore this weekend! In the vein of our community partner for the year, HSSV, we’ve deviated a bit to help the veterans that the folks at TechNet support.
We believe your data should be used for something good, and want to encourage you to think about what your data is worth.
To help us with our mission, Tinfoil Security has partnered with Freedom Service Dogs of America for TechNet 2019. FSDA provides assistance dogs for veterans returning from Iraq and Afghanistan. Each dog provides support as veterans re-enter civilian life, including companionship for disabled veterans. For every badge we scan, we will make a donation to FSDA. The more scans we get, the more money goes toward this incredible foundation, and the more of an impact we can make for those who gave so much to protect us. Help us help veterans.
Additionally, we’re excited to announce that our founders have both won the AFCEA 40 under 40 award this year! We are humbled and honored to be selected. We have a booth at TechNet Cyber in Baltimore this year, and would love to chat. Feel free to stop by booth 2143 for a conversation and a tinfoil hat.
TechNet Cyber - a way to connect government and military leaders with industry leaders - gives us the opportunity to further educate business and government leaders about the necessity of a security-first mindset. We believe that the internet and its users need more security, and we strive to make that goal a reality, one company at a time. Even if just a few companies or government agencies see the advantages of securing web applications and APIs in their DevOps pipeline, then our mission at TechNet Cyber was a success.
Most post-breach plans by IT leaders have a common flaw: the plans are reactive, rather than proactive. Security should be the first thing you develop a mindset for. A good guide to start with is the SANS Top 5, especially if a security team is a new addition to your organization.
A large number of breaches are the direct result of something easily preventable; that is their most frustrating aspect, since the damage to consumers and companies could have been avoided. The rate of human error is too high for the demanding nature of today’s SDLC; that’s where automated tools come in. A solid security team augmented with a great web application and API scanner helps ensure that you are doing everything you can to be secure.
We are excited to attend TechNet Cyber 2019. We’ll be at booth 2143, so make sure you stop by and have a conversation about how our web application and API scanners can help your company be secure and efficient. If nothing else, stop by the booth to see if you can do more push-ups than Derek. ;)
RSA Conference 2019: Questions, Comments, Concerns?
April 26, 2019
What Just Happened?!
A little over a month ago, the Tinfoil Security team was out in full force at RSA. We met and talked to over a thousand of you. We were ecstatic about these conversations, as what we have been working on fell in line with what developers, CISOs, and security engineers see as relevant and essential. The fact that security continues to come to the forefront is good for everyone as a whole. The more security awareness is out there, the more the expectation for security will rise, and companies with poor security practices will lose out. Consumer information will be protected with better and improving security practices and processes.
There were a few things that we believe stuck out the most at the conference. First, GDPR: what is going on with it, how do we comply, and do we need to comply? These were a few of the questions floating around regarding GDPR, which has created a whole new sub-industry. At this year’s RSA, there were many consulting, auditing, or assessment companies explicitly playing in the GDPR “field.” We gained some insight into GDPR as far as it applies to companies moving forward: in a general sense, there is a lack of a clear plan or outline for GDPR compliance. California has come up with its own new privacy law that has companies concerned and confused as well. The sentiment I gained from this is that there will be a rather large learning curve when it comes to tech companies and GDPR compliance.
The second theme was the adoption of the idea of Security of DevOps or a security-first mindset when developing applications. This idea was pushed further into the light by our very own CEO, Ainsley, with her talk at RSA (“Building Security Processes for Developers and DevOps Teams”) which highlighted how the world of technology and development has changed. Briefly, the pace at which we push code is impossible to keep up with security-wise using old processes, technologies, and teams. Tackling this monumental task means building a multidisciplinary team to handle security and development for your applications. Her talk also focused on how to bridge the communication gap between operations, security, and development, breaking down the walls that currently exist.
The last hot topic was IoT and securing IoT devices. Few companies know how to build secure internet-connected devices. As the adoption of IoT devices ramps up, consumers are left less secure and have a wider attack surface. For the time being, consumers should consider being skeptical of IoT devices and take into consideration the increased level of vulnerability they are opening themselves up to. There are a few fundamental considerations that should be made by any company that wants to play in the IoT space. At a minimum, companies should find where their threats lie and build processes to reduce those threats. Releasing information to consumers about what they’re working on fixing and what the consumer is still vulnerable to helps as well. There are currently no laws we’re aware of that would force companies to disclose that information to consumers, but companies like Synology, 1Password, and Microsoft do an outstanding job of keeping their consumers up to date about what they're working on and what they have remediated.
This year’s RSA Conference centered around a few main points that were reiterated throughout the entire conference. GDPR is here to stay, so how do companies cope with it? Security for DevOps was a major concern as well, with questions focused on how to get teams to work together, whether they are focused on development or security. Additionally, how can we automate some of the processes involved in developing secure applications? We can help with this point. Lastly, the overall impact of IoT was ever-present, with everyone asking where security in the IoT space is going and how we believe it needs to get there. Did you find these same themes echoed at RSA? Are there things we should know or ideas we missed? Let me know!