Commit 70a174c

Fixup some punctuation and grammar (#29342)
After noticing one punctuation mistake around "however", I figured there must be more. So I fixed them, and some other stuff adjacent to them.
1 parent: 70d00bc · commit: 70a174c

25 files changed: +72 / -75 lines


docs/apache-airflow-providers-google/operators/cloud/dataflow.rst

Lines changed: 1 addition & 1 deletion
@@ -146,7 +146,7 @@ See: `Configuring PipelineOptions for execution on the Cloud Dataflow service <h
 Asynchronous execution
 """"""""""""""""""""""
 
-Dataflow batch jobs are by default asynchronous - however this is dependent on the application code (contained in the JAR
+Dataflow batch jobs are by default asynchronous; however, this is dependent on the application code (contained in the JAR
 or Python file) and how it is written. In order for the Dataflow job to execute asynchronously, ensure the
 pipeline objects are not being waited upon (not calling ``waitUntilFinish`` or ``wait_until_finish`` on the
 ``PipelineResult`` in your application code).
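
To make the asynchronous-execution note concrete, here is a minimal sketch (not part of this commit; the runner, project, and pipeline contents are placeholders) of Python application code that keeps a Dataflow job asynchronous:

.. code-block:: python

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder options; a real job also needs region, staging buckets, etc.
    options = PipelineOptions(runner="DataflowRunner", project="my-project")

    pipeline = beam.Pipeline(options=options)
    pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)

    # Submitting without blocking keeps the job asynchronous.
    result = pipeline.run()
    # Do NOT call result.wait_until_finish() here -- that call is exactly
    # what turns the submission into a blocking (synchronous) one.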

docs/apache-airflow-providers-http/operators.rst

Lines changed: 3 additions & 3 deletions
@@ -49,7 +49,7 @@ the response text back.
 
 For historical reasons, configuring ``HTTPS`` connectivity via HTTP operator is, well, difficult and
 counter-intuitive. The Operator defaults to ``http`` protocol and you can change the schema used by the
-operator via ``scheme`` connection attribute. However this field was originally added to connection for
+operator via ``scheme`` connection attribute. However, this field was originally added to connection for
 database type of URIs, where database schemes are set traditionally as first component of URI ``path``.
 Therefore if you want to configure as ``https`` connection via URI, you need to pass ``https`` scheme
 to the SimpleHttpOperator. AS stupid as it looks, your connection URI will look like this:
@@ -58,8 +58,8 @@ the response text back.
 ``https://your_host:443/my_endpoint`` you need to set the endpoint parameter to ``my_endpoint``.
 Alternatively, if you want, you could also percent-encode the host including the ``https://`` prefix,
 and as long it contains ``://`` (percent-encoded ``%3a%2f%2f``), the first component of the path will
-not be used as scheme. Your URI definition might then look like: ``http://https%3a%2f%2fblue-sea-697d.quartiers047.workers.dev%3a443%2fhttps%2fyour_host:443/``
-In this case however the ``path`` will not be used at all - you still need to use ``endpoint``
+not be used as scheme. Your URI definition might then look like ``http://https%3a%2f%2fblue-sea-697d.quartiers047.workers.dev%3a443%2fhttps%2fyour_host:443/``.
+In this case, however, the ``path`` will not be used at all - you still need to use ``endpoint``
 parameter in the task if wish to make a request with specific path. As counter-intuitive as it is, this
 is historically the way how the operator/hook works and it's not easy to change without breaking
 backwards compatibility because there are other operators build on top of the ``SimpleHttpOperator`` that
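
For illustration (not from this commit; the connection id and host are made up), the percent-encoded connection described above could be defined via an environment variable and used with the ``endpoint`` parameter like this:

.. code-block:: python

    import os

    from airflow.providers.http.operators.http import SimpleHttpOperator

    # Percent-encoding "https://" into the host keeps the first path component
    # from being parsed as a database-style scheme (hypothetical connection id).
    os.environ["AIRFLOW_CONN_MY_HTTPS_CONN"] = "http://https%3a%2f%2fblue-sea-697d.quartiers047.workers.dev%3a443%2fhttps%2fyour_host:443/"

    fetch = SimpleHttpOperator(
        task_id="fetch_my_endpoint",
        http_conn_id="my_https_conn",
        endpoint="my_endpoint",  # the path goes here, not into the URI
        method="GET",
    )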

docs/apache-airflow-providers/howto/create-update-providers.rst

Lines changed: 3 additions & 3 deletions
@@ -246,7 +246,7 @@ In the ``airflow/providers/<NEW_PROVIDER>/provider.yaml`` add information of you
 
 .. note:: Defining your own connection types
 
-You only need to add ``connection-types`` in case you have some hooks that have customized UI behavior. However
+You only need to add ``connection-types`` in case you have some hooks that have customized UI behavior. However,
 it is only supported for Airflow 2.2.0. If your providers are also targeting Airflow below 2.2.0 you should
 provide the deprecated ``hook-class-names`` array. The ``connection-types`` array allows for optimization
 of importing of individual connections and while Airflow 2.2.0 is able to handle both definition, the
@@ -272,10 +272,10 @@ Optional provider features
 This feature is available in Airflow 2.3+.
 
 Some providers might provide optional features, which are only available when some packages or libraries
-are installed. Such features will typically result in ``ImportErrors`` however those import errors
+are installed. Such features will typically result in ``ImportErrors``; however, those import errors
 should be silently ignored rather than pollute the logs of Airflow with false warnings. False warnings
 are a very bad pattern, as they tend to turn into blind spots, so avoiding false warnings is encouraged.
-However until Airflow 2.3, Airflow had no mechanism to selectively ignore "known" ImportErrors. So
+However, until Airflow 2.3, Airflow had no mechanism to selectively ignore "known" ImportErrors. So
 Airflow 2.1 and 2.2 silently ignored all ImportErrors coming from providers with actually lead to
 ignoring even important import errors - without giving the clue to Airflow users that there is something
 missing in provider dependencies.
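
As a sketch of the Airflow 2.3+ mechanism this paragraph describes (the optional library name is hypothetical), a provider can convert the ImportError into an exception Airflow knows to ignore:

.. code-block:: python

    from airflow.exceptions import AirflowOptionalProviderFeatureException

    try:
        # Hypothetical extra dependency backing the optional feature.
        from some_optional_library import OptionalClient
    except ImportError as e:
        # Airflow 2.3+ silently skips features raising this exception instead
        # of polluting the logs with a false import-error warning.
        raise AirflowOptionalProviderFeatureException(e)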

docs/apache-airflow-providers/index.rst

Lines changed: 2 additions & 2 deletions
@@ -83,8 +83,8 @@ Logging
 '''''''
 
 The providers can add additional task logging capabilities. By default ``Apache Airflow`` saves logs for
-tasks locally and make them available to Airflow UI via internal http server, however via providers
-you can add extra logging capabilities, where Airflow Logs can be written to a remote service and
+tasks locally and make them available to Airflow UI via internal http server. However, providers
+can add extra logging capabilities, where Airflow Logs can be written to a remote service and
 retrieved from those services.
 
 You can see all task loggers available via community-managed providers in

docs/apache-airflow/administration-and-deployment/lineage.rst

Lines changed: 1 addition & 1 deletion
@@ -63,7 +63,7 @@ works.
 
 Inlets can be a (list of) upstream task ids or statically defined as an attr annotated object
 as is, for example, the ``File`` object. Outlets can only be attr annotated object. Both are rendered
-at run time. However the outlets of a task in case they are inlets to another task will not be re-rendered
+at run time. However, the outlets of a task in case they are inlets to another task will not be re-rendered
 for the downstream task.
 
 .. note:: Operators can add inlets and outlets automatically if the operator supports it.
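
A short sketch of the inlets/outlets wiring the passage describes (task names and the file URL are illustrative, not from the commit):

.. code-block:: python

    from airflow.lineage.entities import File
    from airflow.operators.bash import BashOperator

    report = File(url="s3://my-bucket/report.csv")

    produce = BashOperator(
        task_id="produce_report",
        bash_command="echo produce",
        outlets=[report],  # outlets must be attr-annotated objects; rendered at run time
    )
    consume = BashOperator(
        task_id="consume_report",
        bash_command="echo consume",
        inlets=["produce_report"],  # inlets may also be upstream task ids
    )
    produce >> consume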

docs/apache-airflow/administration-and-deployment/logging-monitoring/errors.rst

Lines changed: 3 additions & 3 deletions
@@ -95,9 +95,9 @@ environment variables are passed but also all existing environment variables are
 ``SUBPROCESS_`` prefix added. This happens also for all other subprocesses.
 
 This behaviour can be disabled by setting ``default_integrations`` sentry configuration parameter to
-``False`` which disables ``StdlibIntegration``. This however also disables other default integrations
-and you need to enable them manually if you want to get them enabled,
-see `Sentry Default Integrations <https://docs.sentry.io/platforms/python/guides/wsgi/configuration/integrations/default-integrations/>`_
+``False`` which disables ``StdlibIntegration``. However, this also disables other default integrations,
+so you need to enable them manually if you want them to remain enabled
+(see `Sentry Default Integrations <https://docs.sentry.io/platforms/python/guides/wsgi/configuration/integrations/default-integrations/>`_).
 
 .. code-block:: ini

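For context, a sketch of the underlying Sentry SDK behaviour (not Airflow internals; the DSN is a placeholder): once the defaults are disabled, any integration you still want has to be passed back explicitly:

.. code-block:: python

    import sentry_sdk
    from sentry_sdk.integrations.logging import LoggingIntegration

    sentry_sdk.init(
        dsn="https://public@sentry.example.com/1",  # placeholder DSN
        default_integrations=False,           # also drops StdlibIntegration & friends
        integrations=[LoggingIntegration()],  # re-enable only what you need
    )
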
docs/apache-airflow/administration-and-deployment/scheduler.rst

Lines changed: 4 additions & 4 deletions
@@ -111,7 +111,7 @@ outline of the scheduling loop is:
 - Select schedulable TaskInstances, and whilst respecting Pool limits and other concurrency limits, enqueue
 them for execution
 
-This does however place some requirements on the Database.
+This does, however, place some requirements on the Database.
 
 .. _scheduler:ha:db_requirements:
 
@@ -274,7 +274,7 @@ There are several areas of resource usage that you should pay attention to:
 which dramatically decreases performance. Note that Airflow Scheduler in versions prior to ``2.1.4``
 generated a lot of ``Page Cache`` memory used by log files (when the log files were not removed).
 This was generally harmless, as the memory is just cache and could be reclaimed at any time by the system,
-however in version ``2.1.4`` and beyond, writing logs will not generate excessive ``Page Cache`` memory.
+however, in version ``2.1.4`` and beyond, writing logs will not generate excessive ``Page Cache`` memory.
 Regardless - make sure when you look at memory usage, pay attention to the kind of memory you are observing.
 Usually you should look at ``working memory``(names might vary depending on your deployment) rather
 than ``total memory used``.
@@ -314,8 +314,8 @@ Scheduler Configuration options
 """""""""""""""""""""""""""""""
 
 The following config settings can be used to control aspects of the Scheduler.
-However you can also look at other non-performance-related scheduler configuration parameters available at
-:doc:`../configurations-ref` in ``[scheduler]`` section.
+However, you can also look at other non-performance-related scheduler configuration parameters available at
+:doc:`../configurations-ref` in the ``[scheduler]`` section.
 
 - :ref:`config:scheduler__max_dagruns_to_create_per_loop`

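As a small illustration (not from the commit), the ``[scheduler]`` options referenced above can also be inspected programmatically through Airflow's config accessor:

.. code-block:: python

    from airflow.configuration import conf

    # Reads [scheduler] max_dagruns_to_create_per_loop from airflow.cfg,
    # falling back to the built-in default when unset.
    value = conf.getint("scheduler", "max_dagruns_to_create_per_loop")
    print(f"max_dagruns_to_create_per_loop = {value}")
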
docs/apache-airflow/administration-and-deployment/security/secrets/secrets-backend/index.rst

Lines changed: 1 addition & 1 deletion
@@ -114,6 +114,6 @@ Adapt to non-Airflow compatible secret formats for connections
 The default implementation of Secret backend requires use of an Airflow-specific format of storing
 secrets for connections. Currently most community provided implementations require the connections to
 be stored as JSON or the Airflow Connection URI format (see
-:doc:`apache-airflow-providers:core-extensions/secrets-backends`). However some organizations may need to store the credentials (passwords/tokens etc) in some other way, for example if the same credentials store needs to be used for multiple data platforms, or if you are using a service with a built-in mechanism of rotating the credentials that does not work with the Airflow-specific format.
+:doc:`apache-airflow-providers:core-extensions/secrets-backends`). However, some organizations may need to store the credentials (passwords/tokens etc) in some other way. For example, if the same credentials store needs to be used for multiple data platforms, or if you are using a service with a built-in mechanism of rotating the credentials that does not work with the Airflow-specific format.
 In this case you will need to roll your own secret backend as described in the previous chapter,
 possibly extending an existing secrets backend and adapting it to the scheme used by your organization.
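
A rough sketch of such a custom backend, assuming the ``BaseSecretsBackend`` interface and an entirely hypothetical organization-specific store:

.. code-block:: python

    from typing import Optional

    from airflow.secrets import BaseSecretsBackend

    # Hypothetical store; a real backend would call your credentials service.
    _STORE = {
        "my_db": {"proto": "postgresql", "login": "app", "token": "s3cret", "host": "db:5432"},
    }


    class MyOrgSecretsBackend(BaseSecretsBackend):
        """Adapts an organization-specific secret format to the Airflow URI format."""

        def get_conn_uri(self, conn_id: str) -> Optional[str]:
            record = _STORE.get(conn_id)
            if record is None:
                return None  # let Airflow fall through to the next backend
            return f"{record['proto']}://{record['login']}:{record['token']}@{record['host']}"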

docs/apache-airflow/administration-and-deployment/security/webserver.rst

Lines changed: 0 additions & 3 deletions
@@ -66,9 +66,6 @@ following CLI commands to create an account:
 --role Admin \
 
 
-It is however possible to switch on authentication by either using one of the supplied
-backends or creating your own.
-
 To deactivate the authentication and allow users to be identified as Anonymous, the following entry
 in ``$AIRFLOW_HOME/webserver_config.py`` needs to be set with the desired role that the Anonymous
 user will have by default:
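
For illustration, the entry in question is a single Flask-AppBuilder setting; for example, to give anonymous users the ``Public`` role (pick the role your deployment actually needs):

.. code-block:: python

    # $AIRFLOW_HOME/webserver_config.py
    AUTH_ROLE_PUBLIC = "Public"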

docs/apache-airflow/authoring-and-scheduling/dynamic-task-mapping.rst

Lines changed: 2 additions & 2 deletions
@@ -169,7 +169,7 @@ As well as a single parameter it is possible to pass multiple parameters to expa
 # add(x=8, y=5)
 # add(x=8, y=10)
 
-This would result in the add task being called 6 times. Please note however that the order of expansion is not guaranteed.
+This would result in the add task being called 6 times. Please note, however, that the order of expansion is not guaranteed.
 
 Mapping with non-TaskFlow operators
 ===================================
@@ -522,7 +522,7 @@ There are two limits that you can place on a task:
 
 If you wish to not have a large mapped task consume all available runner slots you can use the ``max_active_tis_per_dag`` setting on the task to restrict how many can be running at the same time.
 
-Note however that this applies to all copies of that task against all active DagRuns, not just to this one specific DagRun.
+Note, however, that this applies to all copies of that task against all active DagRuns, not just to this one specific DagRun.
 
 .. code-block:: python
 
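
A compact sketch (dag and task names are illustrative) of the cross-product expansion and the per-task limit discussed above, assuming the TaskFlow API of recent Airflow (2.4+):

.. code-block:: python

    import pendulum

    from airflow.decorators import dag, task


    @dag(start_date=pendulum.datetime(2023, 1, 1), schedule=None, catchup=False)
    def mapped_example():
        # At most 2 mapped copies run at once -- across *all* active DagRuns.
        @task(max_active_tis_per_dag=2)
        def add(x: int, y: int) -> int:
            return x + y

        # 3 x-values times 2 y-values: the task expands into 6 instances,
        # in no guaranteed order.
        add.expand(x=[2, 4, 8], y=[5, 10])


    mapped_example()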
