# Chrome Security FAQ

[TOC]

<a name="TOC-Why-are-security-bugs-hidden-in-the-Chromium-issue-tracker-"></a>
## Why are security bugs hidden in the Chromium issue tracker?

We must balance a commitment to openness with a commitment to avoiding
unnecessary risk for users of widely-used open source libraries.

<a name="TOC-Can-you-please-un-hide-old-security-bugs-"></a>
## Can you please un-hide old security bugs?

Our goal is to open security bugs to the public once the bug is fixed and the
fix has been shipped to a majority of users. However, many vulnerabilities
affect products besides Chromium, and we don’t want to put users of those
products unnecessarily at risk by opening the bug before fixes for the other
affected products have shipped.

Therefore, we make all security bugs public within approximately 14 weeks of the
fix landing in the Chromium repository. The exception to this is in the event of
the bug reporter or some other responsible party explicitly requesting anonymity
or protection against disclosing other particularly sensitive data included in
the vulnerability report (e.g. username and password pairs).

<a name="TOC-Can-I-get-advance-notice-about-security-bugs-"></a>
## Can I get advance notice about security bugs?

Vendors of products based on Chromium, distributors of operating systems that
bundle Chromium, and individuals and organizations that significantly contribute
to fixing security bugs can be added to a list for earlier access to these bugs.
You can email us at [email protected] to request to join the list if you
meet the above criteria. In particular, vendors of anti-malware, IDS/IPS,
vulnerability risk assessment, and similar products or services do not meet this
bar.

Please note that the safest version of Chrome/Chromium is always the latest
stable version — there is no good reason to wait to upgrade, so enterprise
deployments should always track the latest stable release. When you do this,
there is no need to further assess the risk of Chromium vulnerabilities: we
strive to fix vulnerabilities quickly and release often.

<a name="TOC-Can-I-see-these-security-bugs-so-that-I-can-back-port-the-fixes-to-my-downstream-project-"></a>
## Can I see these security bugs so that I can back-port the fixes to my downstream project?

Many developers of other projects use V8, Chromium, and sub-components of
Chromium in their own projects. This is great! We are glad that Chromium and V8
suit your needs.

We want to open up fixed security bugs (as described in the previous answer),
and will generally give downstream developers access sooner. **However, please
be aware that backporting security patches from recent versions to old versions
cannot always work.** (There are several reasons for this: The patch won't apply
to old versions; the solution was to add or remove a feature or change an API;
the issue may seem minor until it's too late; and so on.) We believe the latest
stable versions of Chromium and V8 are the most stable and secure. We also
believe that tracking the latest stable upstream is usually less work for
greater benefit in the long run than backporting. We strongly recommend that you
track the latest stable branches, and we support only the latest stable branch.

<a name="TOC-Severity-Guidelines"></a>
## How does the Chrome team determine severity of security bugs?

See the [severity guidelines](severity-guidelines.md) for more information.
Only security issues are considered under the security vulnerability rewards
program. Other types of bugs, which we call "functional bugs", are not.

<a name="TOC-Are-privacy-issues-considered-security-bugs-"></a>
## Are privacy issues considered security bugs?

No. The Chrome Privacy team treats privacy issues, such as leaking information
from Incognito, fingerprinting, and bugs related to deleting browsing data, as
functional bugs.

Privacy issues are not considered under the security vulnerability rewards
program; the [severity guidelines](severity-guidelines.md) outline the types of
bugs that are considered security vulnerabilities in more detail.

<a name="TOC-Timing-Attacks"></a>
## Are timing attacks considered security vulnerabilities?

Some timing attacks are considered security vulnerabilities, and some are
considered privacy vulnerabilities. Timing attacks vary significantly in terms
of impact, reliability, and exploitability.

Some timing attacks weaken mitigations like ASLR (e.g.
[Issue 665930](https://crbug.com/665930)). Others attempt to circumvent the same
origin policy, for instance, by using SVG filters to read pixels
cross-origin (e.g. [Issue 686253](https://crbug.com/686253) and
[Issue 615851](https://crbug.com/615851)).

Many timing attacks rely upon the availability of high-resolution timing
information (e.g. [Issue 508166](https://crbug.com/508166)); such timing data
often has legitimate uses in non-attack scenarios, making it unappealing to
remove.

Timing attacks against the browser's HTTP Cache (like
[Issue 74987](https://crbug.com/74987)) can potentially leak information about
which sites the user has previously loaded. The browser could attempt to protect
against such attacks (e.g. by bypassing the cache) at the cost of performance
and thus user experience. To mitigate such timing attacks, end users can
delete browsing history and/or browse sensitive sites using Chrome's Incognito
or Guest browsing modes.

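To make the cache-timing idea concrete, here is a minimal TypeScript sketch of
the kind of probe described above. The URL, the threshold, and the function name
are illustrative assumptions, not code from Chromium or from any real attack.

```typescript
// Hypothetical cache-timing probe: measure how quickly a third-party resource
// loads and guess whether it was already in the HTTP cache.
async function probeCacheTiming(probeUrl: string): Promise<boolean> {
  const start = performance.now();
  // 'no-cors' lets the request complete cross-origin without readable content;
  // only the elapsed time is observed.
  await fetch(probeUrl, { mode: "no-cors", credentials: "include" });
  const elapsed = performance.now() - start;
  // A very fast response suggests a cache hit, hinting that the user has
  // visited the site before. The 20 ms threshold is purely illustrative.
  return elapsed < 20;
}
```
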
Other timing attacks can be mitigated via clever design changes. For instance,
[Issue 544765](https://crbug.com/544765) describes an attack whereby an attacker
can probe for the presence of HSTS rules (set by prior site visits) by timing
the load of resources with URLs "fixed-up" by HSTS. Prior to Chrome 64, HSTS
rules [were shared](https://crbug.com/774643) between regular browsing and
Incognito mode, making the attack more interesting. The attack was mitigated by
changing Content-Security-Policy such that secure (https:) URLs will match rules
demanding non-secure (http:) URLs, a fix that has also proven useful for
unblocking migrations to HTTPS. Similarly, [Issue 707071](https://crbug.com/707071)
describes a timing attack in which an attacker could determine what Android
applications are installed; the attack was mitigated by introducing randomness
in the execution time of the affected API.

<a name="TOC-What-are-the-security-and-privacy-guarantees-of-Incognito-mode-"></a>
## What are the security and privacy guarantees of Incognito mode?

Bugs in Incognito mode are tracked as privacy bugs, not security bugs.

The [Help Center](https://support.google.com/chrome/?p=cpn_incognito) explains
what privacy protections Incognito mode attempts to enforce. In particular,
please note that Incognito is not a “do not track” mode, and it does not hide
aspects of your identity from web sites. Chrome does offer a way to send Do Not
Track requests to servers; see chrome://settings/?search=do+not+track.

When in Incognito mode, Chrome does not store any new history, cookies, or other
state in non-volatile storage. However, Incognito windows will be able to access
some previously-stored state, such as browsing history.

<a name="TOC-Are-denial-of-service-issues-considered-security-bugs-"></a>
## Are denial of service issues considered security bugs?

No. Denial of Service (DoS) issues are treated as **abuse** or **stability**
issues rather than security vulnerabilities.

* If you find a reproducible crash, we encourage you to [report
  it](https://bugs.chromium.org/p/chromium/issues/entry?template=Crash%20Report).
* If you find a site that is abusing the user experience (e.g. preventing you
  from leaving a site), we encourage you to [report
  it](https://crbug.com/new).

DoS issues are not considered under the security vulnerability rewards program;
the [severity guidelines](severity-guidelines.md) outline the types of bugs that
are considered security vulnerabilities in more detail.

<a name="TOC-Are-XSS-filter-bypasses-considered-security-bugs-"></a>
## Are XSS filter bypasses considered security bugs?

No. Chromium once contained a reflected XSS filter called the [XSSAuditor](https://www.chromium.org/developers/design-documents/xss-auditor)
that was a best-effort second line of defense against reflected XSS flaws found
in web sites. The XSS Auditor was [removed in Chrome 78](https://groups.google.com/a/chromium.org/forum/#!msg/blink-dev/TuYw-EZhO9g/blGViehIAwAJ).

<a name="TOC-What-if-a-Chrome-component-breaks-an-OS-security-boundary-"></a>
## What if a Chrome component breaks an OS security boundary?

If Chrome or any of its components (e.g. the updater) can be abused to
perform a local privilege escalation, then it may be treated as a
valid security vulnerability.

Running any Chrome component with higher privileges than intended is
not a security bug, and we do not recommend running Chrome as an
Administrator on Windows, or as root on POSIX.

<a name="TOC-Why-aren-t-physically-local-attacks-in-Chrome-s-threat-model-"></a>
## Why aren't physically-local attacks in Chrome's threat model?

People sometimes report that they can compromise Chrome by installing a
malicious DLL in a place where Chrome will load it, by hooking APIs (e.g. [Issue
130284](https://crbug.com/130284)), or by otherwise altering the configuration
of the device.

We consider these attacks outside Chrome's threat model, because there is no way
for Chrome (or any application) to defend against a malicious user who has
managed to log into your device as you, or who can run software with the
privileges of your operating system user account. Such an attacker can modify
executables and DLLs, change environment variables like `PATH`, change
configuration files, read any data your user account owns, email it to
themselves, and so on. Such an attacker has total control over your device,
and nothing Chrome can do would provide a serious guarantee of defense. This
problem is not special to Chrome — all applications must trust the
physically-local user.

There are a few things you can do to mitigate risks from people who have
physical control over **your** computer, in certain circumstances.

* To stop people from reading your data in cases of device theft or loss, use
  full disk encryption (FDE). FDE is a standard feature of most operating
  systems, including Windows Vista and later, Mac OS X Lion and later, and
  some distributions of Linux. (Some older versions of Mac OS X had partial
  disk encryption: they could encrypt the user’s home folder, which contains
  the bulk of a user’s sensitive data.) Some FDE systems allow you to use
  multiple sources of key material, such as the combination of both a
  password and a key file on a USB token. When available, you should use
  multiple sources of key material to achieve the strongest defense. Chrome
  OS encrypts users’ home directories.
* If you share your computer with other people, take advantage of your
  operating system’s ability to manage multiple login accounts, and use a
  distinct account for each person. For guests, Chrome OS has a built-in
  Guest account for this purpose.
* Take advantage of your operating system’s screen lock feature.
* You can reduce the amount of information (including credentials like
  cookies and passwords) that Chrome will store locally by using Chrome's
  Content Settings (chrome://settings/content) and turning off the form
  auto-fill and password storage features
  ([chrome://settings/search#password](chrome://settings/search#password)).

There is almost nothing you can do to mitigate risks when using a **public**
computer.

* Assume everything you do on a public computer will become, well, public.
  You have no control over the operating system or other software on the
  machine, and there is no reason to trust the integrity of it.
* If you must use such a computer, use Incognito mode and close all Incognito
  windows when you are done browsing to limit the amount of data you leave
  behind. Note that Incognito mode **provides no protection** if the system has
  already been compromised as described above.

<a name="TOC-Why-aren-t-compromised-infected-machines-in-Chrome-s-threat-model-"></a>
## Why aren't compromised/infected machines in Chrome's threat model?

Although the attacker may now be remote, the consequences are essentially the
same as with physically-local attacks. The attacker's code, when it runs as
your user account on your machine, can do anything you can do. (See also
[Microsoft's Ten Immutable Laws Of
Security](https://web.archive.org/web/20160311224620/https://technet.microsoft.com/en-us/library/hh278941.aspx).)

Other cases covered by this section include leaving a debugger port open to
the world, remote shells, and so forth.

<a name="TOC-What-about-unmasking-of-passwords-with-the-developer-tools-"></a>
## What about unmasking of passwords with the developer tools?

One of the most frequent reports we receive is password disclosure using the
Inspect Element feature (see [Issue 126398](https://crbug.com/126398) for an
example). People reason that "If I can see the password, it must be a bug."
However, this is just one of the [physically-local attacks described in the
previous
section](#TOC-Why-aren-t-physically-local-attacks-in-Chrome-s-threat-model-),
and all of those points apply here as well.

The reason the password is masked is only to prevent disclosure via
"shoulder-surfing" (i.e. the passive viewing of your screen by nearby persons),
not because it is a secret unknown to the browser. The browser knows the
password at many layers, including JavaScript, developer tools, process memory,
and so on. When you are physically local to the computer, and only when you are
physically local to the computer, there are, and always will be, tools for
extracting the password from any of these places.

<a name="TOC-Does-entering-JavaScript:-URLs-in-the-URL-bar-or-running-script-in-the-developer-tools-mean-there-s-an-XSS-vulnerability-"></a>
## Does entering JavaScript: URLs in the URL bar or running script in the developer tools mean there's an XSS vulnerability?

[No](https://crbug.com/81697). Chrome does not attempt to prevent the user from
knowingly running script against loaded documents, either by entering script in
the Developer Tools console or by typing a JavaScript: URI into the URL bar.
Chrome and other browsers do undertake some efforts to prevent *paste* of script
URLs in the URL bar (to limit
[social-engineering](https://blogs.msdn.microsoft.com/ieinternals/2011/05/19/socially-engineered-xss-attacks/)),
but users are otherwise free to invoke script against pages using either the URL
bar or the DevTools console.

<a name="TOC-Does-executing-JavaScript-from-a-bookmark-mean-there-s-an-XSS-vulnerability-"></a>
## Does executing JavaScript from a bookmark mean there's an XSS vulnerability?

No. Chromium allows users to create bookmarks to JavaScript URLs that will run
on the currently-loaded page when the user clicks the bookmark; these are called
[bookmarklets](https://en.wikipedia.org/wiki/Bookmarklet).

<a name="TOC-Does-executing-JavaScript-in-a-PDF-file-mean-there-s-an-XSS-vulnerability-"></a>
## Does executing JavaScript in a PDF file mean there's an XSS vulnerability?

No. PDF files have the ability to run JavaScript, usually to facilitate field
validation during form fill-out. Note that the set of bindings provided to
the PDF is more limited than those provided by the DOM to HTML documents (e.g.
no document.cookie).

<a name="TOC-Is-Chrome-s-support-for-userinfo-in-HTTP-URLs-e.g.-http:-user:password-example.com-considered-a-vulnerability-"></a>
## Is Chrome's support for userinfo in HTTP URLs (e.g. http://user:[email protected]) considered a vulnerability?

[Not at this time](https://crbug.com/626951). Chrome supports HTTP and HTTPS
URIs with username and password information embedded within them for
compatibility with sites that require this feature. Notably, Chrome will
suppress display of the username and password information in the URL box after
navigation, to limit the effectiveness of spoofing attacks that may try to
mislead the user. For instance, navigating to
`http://[email protected]` will show an address of
`http://evil.example.com` after the page loads.

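As an illustration, here is a short TypeScript sketch using the standard URL
API, with the values taken from the example above, showing how the userinfo
portion parses separately from the host that Chrome actually connects to and
displays:

```typescript
// The "trusted.example.com" text before the "@" is only userinfo; the real
// host, and the origin shown after navigation, is evil.example.com.
const url = new URL("http://[email protected]/");
console.log(url.username); // "trusted.example.com"
console.log(url.hostname); // "evil.example.com"
console.log(url.origin);   // "http://evil.example.com"
```
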
<a name="TOC-Why-isn-t-passive-browser-fingerprinting-including-passive-cookies-in-Chrome-s-threat-model-"></a>
<a name="TOC-What-is-Chrome-s-threat-model-for-fingerprinting-"></a>
## What is Chrome's threat model for fingerprinting?

> **Update, August 2019:** Please note that this answer has changed. We have
> updated our threat model to include fingerprinting.

Although [we do not consider fingerprinting issues to be *security
vulnerabilities*](#TOC-Are-privacy-issues-considered-security-bugs-), we do now
consider them to be privacy bugs that we will try to resolve. We distinguish two
forms of fingerprinting.

* **Passive fingerprinting** refers to fingerprinting techniques that do not
  require a JavaScript API call to achieve. This includes (but is not limited to)
  mechanisms like [ETag
  cookies](https://en.wikipedia.org/wiki/HTTP_ETag#Tracking_using_ETags) and [HSTS
  cookies](https://security.stackexchange.com/questions/79518/what-are-hsts-super-cookies).
* **Active fingerprinting** refers to fingerprinting techniques that do require
  a JavaScript API call to achieve. Examples include most of the techniques in
  [EFF's Panopticlick proof of concept](https://panopticlick.eff.org).

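For a concrete sense of the active category, here is a minimal TypeScript sketch
of the kind of signal collection Panopticlick-style demos perform. The function
name is illustrative and the list of signals is far from exhaustive.

```typescript
// Each value below requires a JavaScript API call (hence "active"); combined,
// they can contribute many identifying bits about the browser and device.
function collectActiveFingerprint(): string {
  const signals = [
    navigator.userAgent,
    navigator.language,
    String(navigator.hardwareConcurrency),
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
  ];
  return signals.join("|");
}
```
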
For passive fingerprinting, our ultimate goal is (to the extent possible) to
reduce the information content available to below the threshold for usefulness.

For active fingerprinting, our ultimate goal is to establish a [privacy
budget](https://github.com/bslassey/privacy-budget) and to keep web origins
below the budget (such as by rejecting some API calls when the origin exceeds
its budget). To avoid breaking rich web applications that people want to use,
Chrome may increase an origin's budget when it detects that a person is using
the origin heavily. As with passive fingerprinting, our goal is to set the
default budget below the threshold of usefulness for fingerprinting.

These are both long-term goals. As of this writing (August 2019) we do not
expect that Chrome will immediately achieve them.

For background on fingerprinting and the difficulty of stopping it, see [Arvind
Narayanan's site](https://33bits.wordpress.com/about/) and [Peter Eckersley's
discussion of the information theory behind
Panopticlick](https://www.eff.org/deeplinks/2010/01/primer-information-theory-and-privacy).
There is also [a pretty good analysis of in-browser fingerprinting
vectors](https://dev.chromium.org/Home/chromium-security/client-identification-mechanisms).

<a name="TOC-Where-are-the-security-indicators-located-in-the-browser-window-"></a>
## Where are the security indicators located in the browser window?

The topmost portion of the browser window, consisting of the **Omnibox** (or
**Location Bar**), navigation icons, menu icon, and other indicator icons, is
sometimes called the browser **chrome** (not to be confused with the Chrome
Browser itself). Actual security indicators can only appear in this section of
the window. There can be no trustworthy security indicators elsewhere.

Furthermore, Chrome can only guarantee that it is correctly representing URLs
and their origins at the end of all navigation. Quirks of URL parsing, HTTP
redirection, and so on are not security concerns unless Chrome is
misrepresenting a URL or origin after navigation has completed.

Browsers present a dilemma to the user since the output is a combination of
information coming from both trustworthy sources (the browser itself) and
untrustworthy sources (the web page), and the untrustworthy sources are allowed
virtually unlimited control over graphical presentation. The only restriction on
the page's presentation is that it is confined to the large rectangular area
directly underneath the chrome, called the **viewport**. Things like hover text
and URL preview(s), shown in the viewport, are entirely under the control of the
web page itself. They have no guaranteed meaning, and function only as the page
desires. This can be even more confusing when pages load content that looks like
chrome. For example, many pages load images of locks, which look similar to the
meaningful HTTPS lock in the Omnibox, but in fact do not convey any meaningful
information about the transport security of that page.

When the browser needs to show trustworthy information, such as the bubble
resulting from a click on the lock icon, it does so by making the bubble overlap
chrome. In the case of the lock bubble, it is a small triangular bump in the
border of the bubble that overlays the chrome. This visual detail can't be
imitated by the page itself since the page is confined to the viewport.

<a name="TOC-Why-does-Chrome-show-a-green-lock-even-if-my-HTTPS-connection-is-being-proxied-"></a>
## Why does Chrome show a green lock, even if my HTTPS connection is being proxied?

Some types of software intercept HTTPS connections. Examples include anti-virus
software, corporate network monitoring tools, and school censorship software. In
order for the interception to work, you need to install a private trust anchor
(root certificate) onto your computer. This may have happened when you installed
your anti-virus software, or when your company's network administrator set up
your computer. If that has occurred, your HTTPS connections can be viewed or
modified by the software.

Since you have allowed the trust anchor to be installed onto your computer,
Chrome assumes that you have consented to HTTPS interception. Anyone who can add
a trust anchor to your computer can make other changes to your computer, too,
including changing Chrome. (See also [Why aren't physically-local attacks in
Chrome's threat model?](#TOC-Why-aren-t-physically-local-attacks-in-Chrome-s-threat-model-).)

<a name="TOC-Why-can-t-I-select-Proceed-Anyway-on-some-HTTPS-error-screens-"></a>
## Why can’t I select Proceed Anyway on some HTTPS error screens?

A key guarantee of HTTPS is that Chrome can be relatively certain that it is
connecting to the true web server and not an impostor. Some sites request an
even higher degree of protection for their users (i.e. you): they assert to
Chrome (via Strict Transport Security —
[HSTS](https://tools.ietf.org/html/rfc6797) — or by other means) that any
server authentication error should be fatal, and that Chrome must close the
connection. If you encounter such a fatal error, it is likely that your network
is under attack, or that there is a network misconfiguration that is
indistinguishable from an attack.

The best thing you can do in this situation is to raise the issue to your
network provider (or corporate IT department).

Chrome shows non-recoverable HTTPS errors only in cases where the true server
has previously asked for this treatment, and when it can be relatively certain
that the current server is not the true server.

<a name="TOC-How-does-key-pinning-interact-with-local-proxies-and-filters-"></a>
## How does key pinning interact with local proxies and filters?

To enable certificate chain validation, Chrome has access to two stores of trust
anchors (i.e. certificates that are empowered as issuers). One trust anchor
store is the system or public trust anchor store, and the other is the
local or private trust anchor store. The public store is provided as part of
the operating system, and intended to authenticate public internet servers. The
private store contains certificates installed by the user or the administrator
of the client machine. Private intranet servers should authenticate themselves
with certificates issued by a private trust anchor.

Chrome’s key pinning feature is a strong form of web site authentication that
requires a web server’s certificate chain not only to be valid and to chain to a
known-good trust anchor, but also that at least one of the public keys in the
certificate chain is known to be valid for the particular site the user is
visiting. This is a good defense against the risk that any trust anchor can
authenticate any web site, even if not intended by the site owner: if an
otherwise-valid chain does not include a known pinned key (“pin”), Chrome will
reject it because it was not issued in accordance with the site operator’s
expectations.

Chrome does not perform pin validation when the certificate chain chains up to a
private trust anchor. A key result of this policy is that private trust anchors
can be used to proxy (or
[MITM](https://en.wikipedia.org/wiki/Man-in-the-middle_attack)) connections, even
to pinned sites. “Data loss prevention” appliances, firewalls, content filters,
and malware can use this feature to defeat the protections of key pinning.

We deem this acceptable because the proxy or MITM can only be effective if the
client machine has already been configured to trust the proxy’s issuing
certificate — that is, the client is already under the control of the person who
controls the proxy (e.g. the enterprise’s IT administrator). If the client does
not trust the private trust anchor, the proxy’s attempt to mediate the
connection will fail as it should.

<a name="TOC-When-is-key-pinning-enabled-"></a>
## When is key pinning enabled?

Key pinning is enabled for Chrome-branded, non-mobile builds when the local
clock is within ten weeks of the embedded build timestamp. Key pinning is a
useful security measure, but it tightly couples client and server configurations
and completely breaks when those configurations are out of sync. In order to
manage that risk we need to ensure that we can promptly update pinning clients
in an emergency and ensure that non-emergency changes can be deployed in a
reasonable timeframe.

Each of the conditions listed above helps ensure those properties:
Chrome-branded builds are those that Google provides, and they all have an
auto-update mechanism that can be used in an emergency. However, auto-update on
mobile devices is significantly less effective, so those devices are excluded.
Even in cases where auto-update is generally effective, there are still
non-trivial populations of stragglers for various reasons. The ten-week timeout
prevents those stragglers from causing problems for regular, non-emergency
changes and allows stuck users to still, for example, conduct searches and
access Chrome's homepage to hopefully get unstuck.

In order to determine whether key pinning is active, try loading
[https://pinningtest.appspot.com](https://pinningtest.appspot.com). If key
pinning is active the load will _fail_ with a pinning error.

<a name="TOC-How-does-certificate-transparency-interact-with-local-proxies-and-filters-"></a>
## How does Certificate Transparency interact with local proxies and filters?

Just as [pinning only applies to publicly-trusted trust
anchors](#TOC-How-does-key-pinning-interact-with-local-proxies-and-filters-),
Chrome only evaluates Certificate Transparency (CT) for publicly-trusted trust
anchors. Thus private trust anchors, such as for enterprise middle-boxes and AV
proxies, do not need to be publicly logged in a CT log.

<a name="TOC-Can-I-use-EMET-to-help-protect-Chrome-against-attack-on-Microsoft-Windows-"></a>
## Can I use EMET to help protect Chrome against attack on Microsoft Windows?

There are [known compatibility
problems](https://sites.google.com/a/chromium.org/dev/Home/chromium-security/chromium-and-emet)
between Microsoft's EMET anti-exploit toolkit and some versions of Chrome. These
can prevent Chrome from running in some configurations. Moreover, the Chrome
security team does not recommend the use of EMET with Chrome because its most
important security benefits are redundant with or superseded by built-in attack
mitigations within the browser. For users, the very marginal security benefit is
not usually a good trade-off for the compatibility issues and performance
degradation the toolkit can cause.

<a name="TOC-Why-are-some-web-platform-features-only-available-in-HTTPS-page-loads-"></a>
## Why are some web platform features only available in HTTPS page-loads?

The full answer is here: we [Prefer Secure Origins For Powerful New
Features](https://www.chromium.org/Home/chromium-security/prefer-secure-origins-for-powerful-new-features).
In short, many web platform features give web origins access to sensitive new
sources of information, or significant power over a user's experience with their
computer/phone/watch/etc. We would therefore
like to have some basis to believe the origin meets a minimum bar for security,
that the sensitive information is transported over the Internet in an
authenticated and confidential way, and that users can make meaningful choices
to trust or not trust a web origin.

Note that the reason we require secure origins for WebCrypto is slightly
different: An application that uses WebCrypto is almost certainly using it to
provide some kind of security guarantee (e.g. encrypted instant messages or
email). However, unless the JavaScript was itself transported to the client
securely, it cannot actually provide any guarantee. (After all, a MITM attacker
could have modified the code, if it was not transported securely.)

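To illustrate the WebCrypto point, here is a hedged TypeScript sketch of a page
encrypting a note with `crypto.subtle`; the function name and parameters are
illustrative. If this script itself arrived over plain HTTP, a network attacker
could have replaced it before it ran, so the encryption would guarantee nothing,
which is why WebCrypto is limited to secure origins.

```typescript
// Illustrative only: AES-GCM encryption of a short note with WebCrypto.
// The security of the result depends on this code having been delivered
// over an authenticated channel (HTTPS) in the first place.
async function encryptNote(note: string, key: CryptoKey): Promise<ArrayBuffer> {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit nonce
  return crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(note),
  );
}
```
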
<a name="TOC-Which-origins-are-secure-"></a>
## Which origins are "secure"?

Secure origins are those that match at least one of the following (scheme, host,
port) patterns:

* (https, *, *)
* (wss, *, *)
* (*, localhost, *)
* (*, 127/8, *)
* (*, ::1/128, *)
* (file, *, —)
* (chrome-extension, *, —)

That is, secure origins are those that load resources either from the local
machine (necessarily trusted) or over the network from a
cryptographically-authenticated server. See [Prefer Secure Origins For Powerful
New
Features](https://sites.google.com/a/chromium.org/dev/Home/chromium-security/prefer-secure-origins-for-powerful-new-features)
for more details.

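As a rough illustration of the patterns above, here is a simplified TypeScript
sketch of such a check. The function name and the simplified host matching are
assumptions for illustration, not Chromium's actual implementation, which lives
in C++ and handles many more cases.

```typescript
// Simplified approximation of the (scheme, host, port) patterns listed above.
function isSecureOrigin(scheme: string, host: string): boolean {
  if (scheme === "https" || scheme === "wss") return true;      // (https|wss, *, *)
  if (scheme === "file" || scheme === "chrome-extension") return true;
  if (host === "localhost") return true;                        // (*, localhost, *)
  if (/^127(\.\d{1,3}){3}$/.test(host)) return true;            // (*, 127/8, *)
  if (host === "::1" || host === "[::1]") return true;          // (*, ::1/128, *)
  return false;
}
```
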
<a name="TOC-What-s-the-story-with-certificate-revocation-"></a>
## What's the story with certificate revocation?

Chrome's primary mechanism for checking the revocation status of HTTPS
certificates is
[CRLsets](https://dev.chromium.org/Home/chromium-security/crlsets).

Chrome also supports Online Certificate Status Protocol (OCSP). However, the
effectiveness of OCSP is essentially 0 unless the client fails hard (refuses
to connect) if it cannot get a live, valid OCSP response. No browser has OCSP
set to hard-fail by default, for good reasons explained by Adam Langley (see
[https://www.imperialviolet.org/2014/04/29/revocationagain.html](https://www.imperialviolet.org/2014/04/29/revocationagain.html) and
[https://www.imperialviolet.org/2014/04/19/revchecking.html](https://www.imperialviolet.org/2014/04/19/revchecking.html)).

Stapled OCSP with the Must Staple option (hard-fail if a valid OCSP response is
not stapled to the certificate) is a much better solution to the revocation
problem than non-stapled OCSP. CAs and browsers are working toward that solution
(see the
[Internet-Draft](https://tools.ietf.org/html/draft-hallambaker-tlssecuritypolicy-03)).

Additionally, non-stapled OCSP poses a privacy problem: in order to check the
status of a certificate, the client must query an OCSP responder for the status
of the certificate, thus exposing a user's HTTPS browsing history to the
responder (a third party).

That said, you can use enterprise policies to [enable soft-fail
OCSP](https://cloud.google.com/docs/chrome-enterprise/policies/?policy=EnableOnlineRevocationChecks)
and hard-fail OCSP for [local trust
anchors](https://cloud.google.com/docs/chrome-enterprise/policies/?policy=RequireOnlineRevocationChecksForLocalAnchors).

Chrome performs online checking for [Extended
Validation](https://cabforum.org/about-ev-ssl/) certificates if it does not
already have a non-expired CRLSet entry covering the domain. If Chrome does not
get a response, it simply downgrades the security indicator to Domain Validated.

See also [Issue 361820](https://crbug.com/361820) for more discussion of the
user-facing UX.

<a name="TOC-Why-does-the-Password-Manager-ignore-autocomplete-off-for-password-fields-"></a>
## Why does the Password Manager ignore `autocomplete='off'` for password fields?

Ignoring `autocomplete='off'` for password fields allows the password manager to
give more power to users to manage their credentials on websites. It is the
security team's view that this is very important for user security, because it
allows users to have unique and more complex passwords for websites. As it was
originally implemented, `autocomplete='off'` for password fields took control
away from the user and gave control to the web site developer, which was also a
violation of the [priority of
constituencies](http://www.schemehostport.com/2011/10/priority-of-constituencies.html).
For a longer discussion on this, see the [mailing list
announcement](https://groups.google.com/a/chromium.org/forum/#!topic/chromium-dev/zhhj7hCip5c).

<a name="TOC-Signout-of-Chrome"></a>
## Does signing out of Chrome delete previously-synced data?

Not necessarily. If you have signed into Chrome and subsequently sign out of
Chrome, previously saved passwords and other data are not deleted from your
device unless you select that option when signing out of Chrome.

If you change your Google password, synced data will no longer be updated in
Chrome instances until you provide the new password to Chrome on each device
configured to sync. However, previously synced data [remains available](https://crbug.com/792967)
on each previously-syncing device unless manually removed.

<a name="TOC-Why-doesn-t-the-Password-Manager-save-my-Google-password-if-I-am-using-Chrome-Sync-"></a>
## Why doesn't the Password Manager save my Google password if I am using Chrome Sync?

In its default mode, Chrome Sync uses your Google password to protect all the
other passwords in the Chrome Password Manager.

In general, it is a bad idea to store the credential that protects an asset in
the same place as the asset itself. An attacker who could temporarily compromise
the Chrome Password Manager could, by stealing your Google password, obtain
continuing access to all your passwords. Imagine you store your valuables in a
safe, and you accidentally forget to close the safe. If a thief comes along,
they might steal all of your valuables. That’s bad, but imagine if you had also
left the combination to the safe inside as well. Now the bad guy has access to
all of your valuables and all of your future valuables, too. The password
manager is similar, except you probably would not even know if a bad guy
accessed it.

To prevent this type of attack, Chrome Password Manager does not save the Google
password for the account you sync with Chrome. If you have multiple Google
accounts, the Chrome Password Manager will save the passwords for accounts other
than the one you are syncing with.

<a name="TOC-Does-the-Password-Manager-store-my-passwords-encrypted-on-disk-"></a>
## Does the Password Manager store my passwords encrypted on disk?

Chrome generally tries to use the operating system's user storage mechanism
wherever possible, and stores passwords encrypted on disk, but the details are
platform specific:

* On Windows, Chrome uses the [Data Protection API
  (DPAPI)](https://msdn.microsoft.com/en-us/library/ms995355.aspx) to bind
  your passwords to your user account and store them on disk encrypted with
  a key only accessible to processes running as the same logged-on user.
* On macOS, Chrome previously stored credentials directly in the user's
  Keychain, but for technical reasons, it has switched to storing the
  credentials in "Login Data" in the Chrome user's profile directory, but
  encrypted on disk with a key that is then stored in the user's Keychain.
  See [Issue 466638](https://crbug.com/466638) for further explanation.
* On Linux, Chrome previously stored credentials directly in the user's
  Gnome Keyring or KWallet, but for technical reasons, it has switched to
  storing the credentials in "Login Data" in the Chrome user's profile directory,
  but encrypted on disk with a key that is then stored in the user's Gnome
  Keyring or KWallet. If there is no available Keyring or KWallet, the data is
  not encrypted when stored.
* On iOS, passwords are currently stored directly in the iOS Keychain and
  referenced from the rest of the metadata stored in a separate DB. The plan
  there is to just store them in plain text in the DB, because iOS gives
  strong guarantees about only Chrome being able to access its storage. See
  [Issue 520437](https://crbug.com/520437) to follow this migration.

<a name="TOC-I-found-a-phishing-or-malware-site-not-blocked-by-Safe-Browsing.-Is-this-a-security-vulnerability-"></a>
## I found a phishing or malware site not blocked by Safe Browsing. Is this a security vulnerability?

Malicious sites not yet blocked by Safe Browsing can be reported via
[https://www.google.com/safebrowsing/report_phish/](https://www.google.com/safebrowsing/report_phish/).
Safe Browsing is primarily a blocklist of known-unsafe sites; the feature warns
the user if they attempt to navigate to a site known to deliver phishing or
malware content. You can learn more about this feature in these references:

* [https://developers.google.com/safe-browsing/](https://developers.google.com/safe-browsing/)
* [https://www.google.com/transparencyreport/safebrowsing/](https://www.google.com/transparencyreport/safebrowsing/)

In general, it is not considered a security bug if a given malicious site is not
blocked by the Safe Browsing feature, unless the site is on the blocklist but is
allowed to load anyway. For instance, if a site found a way to navigate through
the blocking red warning page without user interaction, that would be a security
bug. A malicious site may exploit a security vulnerability (for instance,
spoofing the URL in the **Location Bar**). This would be tracked as a security
vulnerability in the relevant feature, not Safe Browsing itself.

<a name="TOC-What-is-the-security-story-for-Service-Workers-"></a>
## What is the security story for Service Workers?

See our dedicated [Service Worker Security
FAQ](https://chromium.googlesource.com/chromium/src/+/master/docs/security/service-worker-security-faq.md).

<a name="TOC-What-about-URL-spoofs-using-Internationalized-Domain-Names-IDN-"></a>
## What about URL spoofs using Internationalized Domain Names (IDN)?

We try to balance the needs of our international userbase while protecting users
against confusable homograph attacks. Despite this, there is a list of known
IDN display issues we are still working on.

* Please see [this document](https://docs.google.com/document/d/1_xJz3J9kkAPwk3pma6K3X12SyPTyyaJDSCxTfF8Y5sU)
  for a list of known issues and how we handle them.
* [This document](https://www.chromium.org/developers/design-documents/idn-in-google-chrome)
  describes Chrome's IDN policy in detail.

<a name="TOC-Chrome-silently-syncs-extensions-across-devices.-Is-this-a-security-vulnerability-"></a>
## Chrome silently syncs extensions across devices. Is this a security vulnerability?

If an attacker has access to one of a victim's devices, the attacker can install
an extension which will be synced to the victim's other sync-enabled
devices. Similarly, an attacker who phishes a victim's Google credentials can
sign in to Chrome as the victim and install an extension, which will be synced
to the victim's other sync-enabled devices. Sync thereby enables an attacker to
elevate phished credentials or physical access to persistent access on all of a
victim's sync-enabled devices.

To mitigate this issue, Chrome only syncs extensions that have been installed
from the Chrome Web Store. Extensions in the Chrome Web Store are monitored for
abusive behavior.

In the future, we may pursue further mitigations. However, because an attacker
must already have the victim's Google credentials and/or [physical access to a
device](#TOC-Why-aren-t-physically-local-attacks-in-Chrome-s-threat-model), we
don't consider this attack a security vulnerability.

We **do** consider it a vulnerability if an attacker can get an extension to
sync to a victim's device without either of the above preconditions. For
example, we consider it a vulnerability if an attacker could craft a request to
Google's sync servers that causes an extension to be installed to a user's
device, or if an attacker could entice a victim to visit a webpage that causes
an extension to be installed on their device(s). Please report such bugs via
https://bugs.chromium.org/p/chromium/issues/entry?template=Security+Bug.

<a name="TOC-Are-PDF-files-static-content-in-Chromium-"></a>
## Are PDF files static content in Chromium?

No. PDF files have some powerful capabilities including invoking printing or
posting form data. To mitigate abuse of these capabilities, such as beaconing
upon document open, we require interaction with the document (a "user gesture")
before allowing their use.

<a name="TOC-Why-arent-null-pointer-dereferences-considered-security-bugs-"></a>
## Why aren't null pointer dereferences considered security bugs?

Null pointer dereferences with consistent, small, fixed offsets are not considered
security bugs. A read or write to the NULL page results in a non-exploitable crash.
If the offset is larger than a page, or if there's uncertainty about whether the
offset is controllable, it is considered a security bug.