Web Security: Attack, Defend, and Profit Q&A

MODERATOR: We’ve got some
burning questions right there. Let’s do it. AUDIENCE: Hello. So next Tuesday
there is going to be a hearing between the Justice
Department and the EFF on whether or not
courts can request user data from companies. That has in turn created a lot of talk about client-side crypto, and there’s a working group in HTML5 on the Web Crypto API. I was wondering if you guys
had any sort of opinions on web crypto API
and whether or not that was a viable response
to that kind of attack. I know that ProtonMail
just released their beta a week and a half ago. And I was wondering, since
that is so slow currently because it’s JavaScript
crypto, whether or not web crypto API would
contribute to that at all? EDUARDO VELA NAVA: So I think Web Crypto is super useful. Generally speaking, it has problems in the sense that it’s a native crypto library, so it’s likely that people who want to implement their own crypto are going to fail at doing that. But if they use a cryptosystem that is well known, it is most likely going to be safe. I’m not sure if I answered the question, because it seemed like it was whether Web Crypto was cool. I like it. AUDIENCE: My question. My name is [INAUDIBLE]. My question is more general. It’s related to the whole
philosophy we learned today about how important security
is, and to submit security vulnerabilities,
and report the bugs. I’m wondering, Google
has this program where they offer rewards. What about if there exists a vulnerability in an application that isn’t from Google, possibly from other large corporations, maybe some of those who don’t have a bounty program. How can we, as good citizens I guess, report something and not go to jail for that? [LAUGHTER] TIM WILLIS: So to
answer your question about can you submit
bugs to other programs. Yes you can. There are actually
programs which are set up, like the
Internet Bug Bounty, which is a program which allows you
to submit bugs in products that might not have a
vulnerability reward program. And they will actually sponsor
some of those payments as well. So it’s possible for
you to submit a bug through a program like
that, and then you’ll also receive a payment for that. That being said though,
with the wider issue, I think that with the
vulnerability reward program that we run,
and the different types of vulnerability
reward programs we run, you’ll probably see that we
will continue to invest in that. And we’ll probably end up
looking at some more client side applications
in the near future. AUDIENCE: So if some company
doesn’t have a reward system, you can– some other website? Where do you go
for that you said? TIM WILLIS: So one
example of that. There are companies which will
conglomerate bug reporting for you and take care
of the relationship between the company
and the disclosure and keep you at
arm’s length. One example of that is
the Internet Bug Bounty. There are other examples. I think Bug Crowd is another
one that comes to mind. But you can always–
usually security@ for the big companies is
a good place to start. And if you present yourself
in the right light, I can’t imagine you’re going
to run into many problems. AUDIENCE: And can you
just blog about it if you find something
on your own? TIM WILLIS: Well that’s
a personal decision. There’s the debate between
full disclosure, coordinated disclosure, responsible
disclosure, whatever you want to call it. It’s really up to you how you
want to handle that situation. That being said,
there usually are avenues that companies
will be willing to accept those type of bugs. AUDIENCE: Thanks. MODERATOR: Yeah, on company
just got funded for $9 million to help solve that
problem at TechCrunch. What do you see as fundamental
computing security problems? We have the whole arms
race back and forth. What are some things that
are fundamentally technology, structure, approaches, even language syntax, or architecture in clouds? Are there problems that are fundamentally bigger than what one company can fix, that people should be looking at? Like what OpenStack’s doing, and other stuff. TIM WILLIS: Those were just a lot of scary words to me. [LAUGHTER] To be honest, I think,
at the moment at Google, we’re focusing on the
products that we build and how we can do that well. Unless someone else has comments specifically on the wider issue, I’ll hand it to Joel. JOEL WEINBERGER: I think
fundamentally we’ll always have bad
guys who are trying to break whatever we build. So in some sense, it’ll
always be an arms race. But that doesn’t
mean we’re losing. And in fact, from
my perspective, we’ve done a pretty good
job over the last 10 years. If you compare web security and
tools available to us from 10, 15 years ago, it’s just
a world of difference. Even if you compare web
application security more generally to traditional
application security and OS level security, we’ve
done some pretty amazing things with the fact that
one website can’t mess with another website. That’s a pretty big
fundamental win. So I do think there will
always be an arms race. I’m not sure we can
really avoid that. Maybe there’s somebody
who’s got a great idea about how to fix that. But I think there are
leaps and bounds we take. And I actually think we’re
doing a pretty good job in web security. MODERATOR: Why is it that, when we used to have secure computing in projects and other frameworks, the break-ins we have now cascade, because it’s so [INAUDIBLE]? We don’t have tools to keep that from happening with layered security. We have hits that now reach 200 million users, like OAuth or SSL. These are things where the tools we all use now affect everybody in one hit; Heartbleed hits everybody. That didn’t use to happen as much, even though the tools were easy to build with. JOEL WEINBERGER:
Is that– I’m not– AUDIENCE: Is that
a statement of a– AUDIENCE: That’s a statement
and question though, because actually
it’s getting harder to secure larger surface areas. JOEL WEINBERGER: Maybe. I mean, a Windows bug was
always pretty darn bad when it affected
95% of everybody. Worms in the early
2000s instantaneously affected most of the world. So I’m not actually sure
that’s a true statement. There are certainly problems. Those are all valid issues. And I think it’s an
open question how we’re going to address that. I think it’s a great question. EDUARDO VELA NAVA: So
just to mention on that. One thing that might be worth mentioning is that with the way DNS works in general, they try to diversify implementations as much as possible, across operating systems and so on. So if there is a vulnerability in one system, it doesn’t bring down the whole internet. Maybe keeping up with that idea might be worthwhile, but they have other problems that aren’t related anyway. AUDIENCE: OK, so this is mainly directed toward Eduardo and your talk. My question to you, or how do
you think this should be solved, is when you put the responsibility for security on somebody else. In your case it’s you, or Google, or whoever at some company, or some other developer submitting patches. If you put it all outside of your own control, what stops a bad patch from being put into a large system? For example, a very recent famous example is, let’s say, the Heartbleed bug. Everybody’s relying on this one SSL library, everybody’s relying on one specific source. What securities or safeguards do you have against, let’s say, a company that says, oh, let’s patch it. It looks like it works, and it fixes the bug, but it’s actually an evil patch. And the company mistakenly accepts it and now puts everybody in danger. What safeguards do
you foresee that will prevent those kinds of things? EDUARDO VELA NAVA: So it’s pretty similar to what I just mentioned, but one of the reasons that we’re not just sponsoring one framework, for example, is because we want to make sure that people have options. So for example, if there were several frameworks that have roughly the same qualities or values, and people can choose and compete within them, like you can choose which one you want to use, then if there are two, half of the internet will be broken. If there are 20, then 1/20 of the internet will be broken. And that’s the problem with a common component: if everyone trusts it, it’s going to affect a lot of people. One other thing that I mentioned at the end, which might not make sense for OpenSSL, for example, as you mentioned with Heartbleed, but makes sense for other common components, like FFmpeg or things like that, is sandboxing. So if you sandbox things that have a very specific set of constraints, that only need to do a specific subset of [INAUDIBLE], then a vulnerability in that system, I don’t think, is going to have as big of an impact as it would have otherwise. AUDIENCE: And then just the
one point to query on that. The sandbox itself, would that be in any way vulnerable to, say, somebody bugging the sandbox or making it not a sandbox? Is that possible? EDUARDO VELA NAVA: That will
be a feature of the operating system or of the platform. So for example, in BSD you have [INAUDIBLE] jails. Or in Linux, you will have seccomp or whatever sandbox. Windows has their own. Mac OS has their own. On the web, we have the appropriately named iframe sandbox, and CSP can be somewhat useful for sandboxing as well. So it’s a feature of the framework. If the [INAUDIBLE] is broken, then the internet is broken as well. But then you have several browsers, hopefully. As long as there is no complete diversification, there will always be some sort of one thing that can break the internet. OpenSSL is always a great example. But there is also NSS, for example. And many people didn’t break, because they didn’t use OpenSSL; they used NSS. And that doesn’t mean that NSS is not vulnerable. Maybe NSS is vulnerable to something even worse, but then people on OpenSSL would not be vulnerable. That’s the problem with common components. PARISA TABRIZ: I
just want to say one thing, which is that one of the fundamental security design principles is defense in depth, right? So at any point, if your software’s total security is dependent on one bug, then that is something to worry about. And that’s why you want multiple layers of defense. As developers, we rely on abstraction. You will not be able to understand firsthand every single layer of the software you’re trying to build; otherwise you just can’t get anything done. So you’re right. When you are abstracting, you are dependent on somebody else to actually supply code and behave as you expect it to. There is some risk in that there may be a bug there, if not today, then in the future, which is why it’s so important to architect software such that you’re not just reliant on one bug. AUDIENCE: Thanks very much. AUDIENCE: Hi, thanks
for doing this. Question about testing. Are there any
automated ways to test for something that’s
very common like CSRF? If 75% of what you’re
seeing is CSRF out there, are there any frameworks
or third party services that you can use? And if not, are
there any challenges to developing such a service? EDUARDO VELA NAVA: So
for CSRF specifically, I think it mostly depends. Well, this is HTML5, so in this case you might use the traditional model of just having a form, and then the form submitting things automatically, which means that traditional scanners– there are scanners that find CSRF, but they might not find it on very client-side-rich applications. I think, at the end of the day in this case, you need to either depend on the framework to do it safely, or, if you want to really do it by testing, then you need to either do something specific to your framework or use a tool. There is one tool called Ratproxy, for example, that one of my colleagues released, which analyzes requests passively. So you run it as a proxy, and as you make requests, it analyzes them. And then in some cases, it repeats requests that it thinks are state-changing, without the CSRF token, and may find bugs. But it’s not foolproof, of course. With testing you mostly are trying to find these bugs, but you’re not sure that they’re not going to happen. That may help. Anyone else? AUDIENCE: Hi. Thank you for this
nice presentation. My question is, if HTTP is a very vulnerable protocol, why not get rid of it, or disallow it? JOEL WEINBERGER: Oh,
you’re my best friend. I would love to do that. No, I mean, the answer
is, as a security guy, my goal is not to
break the world, right? As much as I’d
like to sometimes, it’s really important that
the world keep working. And while it would be great
to live in a world in which we only have HTTPS, that’s a
world we’re working towards. There are things
we do to encourage people to go to HTTPS. We have these green locks,
which look really nice and are really pretty,
and they actually make your website look better. And that’s great. And there’s a reason we do that. And we want users to
know that they’re secure. And when they’re not over HTTPS,
we want them to not have that feeling of security,
because they shouldn’t. But if any of the
browsers, for example, were to just take
away HTTP, nobody would use that web browser, or
at least a very small number would. Well, no because the
users would know. And users would know
because they wouldn’t be able to get to The New
York Times, for example. Just as a random example. So our job as
software engineers who focus on security, or
as security engineers, is to make the
world a place where HTTPS is a place you want
to be as web developers, where you want to be as users. And if we’re not
doing that, then we’ve done something wrong. So we’re trying really
hard to get there, and we’re trying
harder and harder. And we’re trying to
move towards that. But it’s not something
we could just jump into tomorrow
as much as I’d like to. Although there are
ways you can do that. You can get extensions
for Firefox or Chrome which won’t let you
visit non HTTPS pages. I do recommend that. AUDIENCE: Yeah. But one more comment is for non
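The extensions mentioned here essentially rewrite plain-HTTP URLs before the request goes out. A toy version of that upgrade rule (illustrative only, not actual extension code; real extensions also carry rulesets and fallbacks):

```javascript
// Sketch of the URL-upgrading idea behind HTTPS-enforcing extensions:
// rewrite http:// navigations to https:// before they happen.
function upgradeToHttps(url) {
  const u = new URL(url); // WHATWG URL parser, global in Node and browsers
  if (u.protocol === 'http:') {
    u.protocol = 'https:';
  }
  return u.toString();
}
```

A real deployment also wants the server to cooperate, for example by sending a Strict-Transport-Security header so the browser remembers to use HTTPS on its own.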
developer or technology expert, they can’t even distinguish
between HTTP and HTTPS. And one of my clients asked me, why did Google+ send me an email that my account was hacked? So I asked her what type of password she had. She said, let me in. So I told her, no, that’s not the password you want. You want at least one capital letter, one digit, and one special character, and at least eight characters, blah, blah, blah. JOEL WEINBERGER:
I’m going to focus on the HTTPS and
users not knowing what that is, and
so on so forth. That’s a totally valid point. And moreover I would make
a stronger statement, which is we don’t
even necessarily want users to know what HTTPS is. We want them to know
secure versus insecure, but we don’t even want them to know what HTTP is. AUDIENCE: They don’t even care
about secure and not secure. JOEL WEINBERGER: So the– AUDIENCE: I’m sorry. JOEL WEINBERGER: The claim
is that users don’t even know secure from insecure,
and that’s possible too. So there’s two answers. One is that we actually do
find differences depending on, for example, the user
interfaces that we have. We do find differences in how
users interact with web pages. So one answer is that, as a
browser vendor in particular, we try to find ways of
expressing or encouraging users to use more secure web
applications through our user interface design. The other answer is, through the same tools, we also want to encourage web developers to choose HTTPS, because that’s ultimately how we’re going to solve this problem. And so we try to come up with ways of making sure that, for our developers, HTTPS is the better option for them. And that’s something
they’ll want to choose. And to make it easier. Part of that’s outreach. Not a coincidence
we’re all here tonight trying to discuss these issues. But other parts are implementation as well. But the really short answer
is it’s really, really tough. It’s really hard to
make this happen. But we’re trying. AUDIENCE: Thank you. JOEL WEINBERGER: And
you guys can all help. Make your website
secure, first of all. And tell all your friends,
and so on and so forth. MODERATOR: Final two questions. EDUARDO VELA NAVA: Just one
more thing on that last thing. There are some
features of HTML5 that are being released
only for HTTPS sites. So that also works
as a carrot to help. AUDIENCE: What about HTTP 2.0? It’s already here. Does it support anything security related? Does it have any security-related fixes? Or, I don’t know. EDUARDO VELA NAVA: So I’m actually not sure who ended up on top. But SPDY, for example: Chrome only supports SPDY over SSL, even though it could support it over non-encrypted channels. And that was similar, right? Just like the carrot. But I think SPDY is possible to run without SSL. It’s just that Chrome doesn’t implement it that way, for security and blah, blah, blah. And the debate’s still being had. But that is the current implementation and how it has been implemented. AUDIENCE: My main question
was, it’s more philosophical. Like today I learned
a lot about security, and I kind of underestimated it. And I know a lot
of companies do. They don’t have a
dedicated security person until something bad happens. Who should be the one who should be an expert, or know how to write better code that has security checks and balances? Should it be developers? Or should it fall under QA, who should be testing for these kinds of attacks? EDUARDO VELA NAVA: So I think it should be the framework developers, and you should just choose the right frameworks. But if you want to go the way of teaching all the developers, then it should be all the developers, and all the QA testers, and everyone, because all of them are humans. PARISA TABRIZ: I can
just say one thing, which is that at Google it is the responsibility of everybody who’s working on a product to care about security, similar to how they care about the usability and, in general, the
quality of the product. Security is just one aspect of
the quality of the software. And I don’t think
you can actually delegate the responsibility
to any single job role. So all software engineers
are responsible for security. All product managers. All program managers. We have people that
work PR and legal, and they need to care
about security too. So I really don’t
think it’s something that you can delegate
to one person, because it’s quality of
software that everybody needs to care about. And that’s how we
make it scale and work across lots of software
in a really big company. AUDIENCE: Last
question for Eduardo. You said to ask you in the questions what you were sorry about. What is that? EDUARDO VELA NAVA: Because I used Prezi, and Prezi is a Flash application, and I used it for an HTML5 talk. [LAUGHTER] That’s why I said sorry. AUDIENCE: OK. Last question. On June 5, there
is a project called Reset the Net that is
asking web developers to do one thing to improve
web security. I was wondering if I could
get an answer from each of the panel on–
obviously your answer is HTTPS– what are the
other three panelists’ opinions on what is
the one thing that you could do to improve
web security. And part of this
movement is to, whenever you do that thing on
June 5, write about it. Publish it, tweet
it, do anything you can to talk
about web security. JOEL WEINBERGER: I’m pretty
sad you didn’t give me a chance to answer that, but
yeah, HTTPS, yes. EDUARDO VELA NAVA: I talked
about framework security and blah, blah, blah. So just make a patch,
or send an email to your favorite
framework and tell them about what you want them to do. Or do it yourself after
talking with them. PARISA TABRIZ: So I’m
also an SSL evangelist. But since you’re already
going to do these things– you get to go last– [LAUGHTER] I guess mine would
be, if you’re going to use existing
frameworks, or components, or whatever, make
sure they’re updated. A lot of the times
vulnerabilities are disclosed, so people
know about the bugs and how to exploit
them, but you actually don’t get the fixes unless
you update your software. So use SSL. Use existing frameworks. And make sure it’s all updated. TIM WILLIS: Don’t worry. I’m used to going last. My last name is WIllis. So I was always
last, until the kids whose last name started
with A complained. And they’re like, all right. For this really
annoying task, we’ll go reverse alphabetical order. Willis, you’re up. For me it would be, if you
can’t sell them on HTTPS and you can’t sell
them on the frameworks, send them the XSS Wargame. Please, please. XSS is a huge problem. And the one thing I would do
is, if you can’t sell them on the frameworks and you
can’t sell them on HTTPS, please use the links
we provided tonight and get them educated
and interested in XSS. Because if we could
squash that, we would be in an amazing position. AUDIENCE: [INAUDIBLE]. MODERATOR: Just go for it. That’s always fine. Thank you. AUDIENCE: Spread the word. MODERATOR: Spread the word. All right. Big round of applause
for all the speakers. [APPLAUSE] We are going to
have the raffle now, and we’re going to sign
off on the live stream. Everyone, hundreds of
people on the live stream, thanks for joining us. We are going to go through
the raffle tickets. Actually first off– [MUSIC PLAYING]
