MINOR: Use https instead of http in links (apache#6477)
Verified that the https links work.

I didn't update the license header in this PR since that touches
so many files. Will file a separate one for that.

Reviewers: Manikumar Reddy <[email protected]>
ijuma authored Apr 22, 2019
1 parent 172fbb2 commit 7d9e93a
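For reference, a link check of this kind can be scripted. The sketch below is a hypothetical, minimal example (it is not the verification actually run for this commit): it scans a checkout for `http://` URLs and reports whether the `https://` variant responds.

```python
#!/usr/bin/env python3
# Hypothetical sketch: find http:// URLs in a checkout and confirm the
# https:// variant answers. Not part of this commit.
import os
import re
import urllib.request

URL_RE = re.compile(r"""http://[^\s"')<>\]]+""")

def https_ok(url, timeout=10):
    """Return True if the https:// form of `url` answers with a non-error status."""
    https_url = "https://" + url[len("http://"):]
    request = urllib.request.Request(https_url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except Exception:
        return False

def check_tree(root="."):
    for dirpath, _, filenames in os.walk(root):
        if os.sep + ".git" in dirpath:
            continue  # skip version-control metadata
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8") as handle:
                    text = handle.read()
            except (UnicodeDecodeError, OSError):
                continue  # skip binary or unreadable files
            for url in sorted(set(URL_RE.findall(text))):
                status = "ok" if https_ok(url) else "FAILED"
                print(f"{path}: {url} -> https {status}")

if __name__ == "__main__":
    check_tree()
```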
Showing 11 changed files with 31 additions and 31 deletions.
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -1,6 +1,6 @@
## Contributing to Kafka

-*Before opening a pull request*, review the [Contributing](http://kafka.apache.org/contributing.html) and [Contributing Code Changes](https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes) pages.
+*Before opening a pull request*, review the [Contributing](https://kafka.apache.org/contributing.html) and [Contributing Code Changes](https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes) pages.

It lists steps that are required before creating a PR.

2 changes: 1 addition & 1 deletion NOTICE
@@ -2,7 +2,7 @@ Apache Kafka
Copyright 2019 The Apache Software Foundation.

This product includes software developed at
-The Apache Software Foundation (http://www.apache.org/).
+The Apache Software Foundation (https://www.apache.org/).

This distribution has a binary dependency on jersey, which is available under the CDDL
License. The source code of jersey can be found at https://github.com/jersey/jersey/.
8 changes: 4 additions & 4 deletions README.md
@@ -1,8 +1,8 @@
Apache Kafka
=================
-See our [web site](http://kafka.apache.org) for details on the project.
+See our [web site](https://kafka.apache.org) for details on the project.

-You need to have [Gradle](http://www.gradle.org/installation) and [Java](http://www.oracle.com/technetwork/java/javase/downloads/index.html) installed.
+You need to have [Gradle](https://www.gradle.org/installation) and [Java](https://www.oracle.com/technetwork/java/javase/downloads/index.html) installed.

Kafka requires Gradle 5.0 or higher.

@@ -19,7 +19,7 @@ Now everything else will work.
### Build a jar and run it ###
./gradlew jar

-Follow instructions in http://kafka.apache.org/documentation.html#quickstart
+Follow instructions in https://kafka.apache.org/documentation.html#quickstart

### Build source jar ###
./gradlew srcJar
@@ -209,4 +209,4 @@ See [vagrant/README.md](vagrant/README.md).
Apache Kafka is interested in building the community; we would welcome any thoughts or [patches](https://issues.apache.org/jira/browse/KAFKA). You can reach us [on the Apache mailing lists](http://kafka.apache.org/contact.html).

To contribute follow the instructions here:
-* http://kafka.apache.org/contributing.html
+* https://kafka.apache.org/contributing.html
8 changes: 4 additions & 4 deletions build.gradle
@@ -189,11 +189,11 @@ subprojects {
pom.project {
name 'Apache Kafka'
packaging 'jar'
-url 'http://kafka.apache.org'
+url 'https://kafka.apache.org'
licenses {
license {
name 'The Apache Software License, Version 2.0'
-url 'http://www.apache.org/licenses/LICENSE-2.0.txt'
+url 'https://www.apache.org/licenses/LICENSE-2.0.txt'
distribution 'repo'
}
}
@@ -1420,7 +1420,7 @@ project(':connect:api') {

javadoc {
include "**/org/apache/kafka/connect/**" // needed for the `javadocAll` task
options.links "http://docs.oracle.com/javase/7/docs/api/"
options.links "https://docs.oracle.com/javase/8/docs/api/"
}

tasks.create(name: "copyDependantLibs", type: Copy) {
@@ -1699,5 +1699,5 @@ task aggregatedJavadoc(type: Javadoc) {
classpath = files(projectsWithJavadoc.collect { it.sourceSets.main.compileClasspath })
includes = projectsWithJavadoc.collectMany { it.javadoc.getIncludes() }
excludes = projectsWithJavadoc.collectMany { it.javadoc.getExcludes() }
options.links "http://docs.oracle.com/javase/7/docs/api/"
options.links "https://docs.oracle.com/javase/8/docs/api/"
}
20 changes: 10 additions & 10 deletions doap_Kafka.rdf
@@ -2,9 +2,9 @@
<?xml-stylesheet type="text/xsl"?>
<rdf:RDF xml:lang="en"
xmlns="http://usefulinc.com/ns/doap#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:asfext="http://projects.apache.org/ns/asfext#"
xmlns:foaf="http://xmlns.com/foaf/0.1/">
xmlns:rdf="https://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:asfext="https://projects.apache.org/ns/asfext#"
xmlns:foaf="https://xmlns.com/foaf/0.1/">
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
@@ -21,22 +21,22 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
<Project rdf:about="http://kafka.apache.org/">
<Project rdf:about="https://kafka.apache.org/">
<created>2014-04-12</created>
<license rdf:resource="http://usefulinc.com/doap/licenses/asl20" />
<name>Apache Kafka</name>
<homepage rdf:resource="http://kafka.apache.org/" />
<asfext:pmc rdf:resource="http://kafka.apache.org" />
<homepage rdf:resource="https://kafka.apache.org/" />
<asfext:pmc rdf:resource="https://kafka.apache.org" />
<shortdesc>Apache Kafka is a distributed, fault tolerant, publish-subscribe messaging.</shortdesc>
<description>A single Kafka broker can handle hundreds of megabytes of reads and writes per second from thousands of clients. Kafka is designed to allow a single cluster to serve as the central data backbone for a large organization. It can be elastically and transparently expanded without downtime. Data streams are partitioned and spread over a cluster of machines to allow data streams larger than the capability of any single machine and to allow clusters of co-ordinated consumers. Kafka has a modern cluster-centric design that offers strong durability and fault-tolerance guarantees. Messages are persisted on disk and replicated within the cluster to prevent data loss. Each broker can handle terabytes of messages without performance impact.</description>
<bug-database rdf:resource="https://issues.apache.org/jira/browse/KAFKA" />
<mailing-list rdf:resource="http://kafka.apache.org/contact.html" />
<download-page rdf:resource="http://kafka.apache.org/downloads.html" />
<mailing-list rdf:resource="https://kafka.apache.org/contact.html" />
<download-page rdf:resource="https://kafka.apache.org/downloads.html" />
<programming-language>Scala</programming-language>
<category rdf:resource="http://projects.apache.org/category/big-data" />
<category rdf:resource="https://projects.apache.org/projects.html?category#big-data" />
<repository>
<SVNRepository>
<location rdf:resource="http://git-wip-us.apache.org/repos/asf/kafka.git"/>
<location rdf:resource="https://gitbox.apache.org/repos/asf/kafka.git"/>
<browse rdf:resource="https://github.com/apache/kafka"/>
</SVNRepository>
</repository>
2 changes: 1 addition & 1 deletion gradle/buildscript.gradle
@@ -17,7 +17,7 @@ repositories {
repositories {
// For license plugin.
maven {
-url 'http://dl.bintray.com/content/netflixoss/external-gradle-plugins/'
+url 'https://dl.bintray.com/content/netflixoss/external-gradle-plugins/'
}
}
}
4 changes: 2 additions & 2 deletions jmh-benchmarks/README.md
@@ -1,11 +1,11 @@
###JMH-Benchmark module

-This module contains benchmarks written using [JMH](http://openjdk.java.net/projects/code-tools/jmh/) from OpenJDK.
+This module contains benchmarks written using [JMH](https://openjdk.java.net/projects/code-tools/jmh/) from OpenJDK.
Writing correct micro-benchmarks in Java (or another JVM language) is difficult and there are many non-obvious pitfalls (many
due to compiler optimizations). JMH is a framework for running and analyzing benchmarks (micro or macro) written in Java (or
another JVM language).

-For help in writing correct JMH tests, the best place to start is the [sample code](http://hg.openjdk.java.net/code-tools/jmh/file/tip/jmh-samples/src/main/java/org/openjdk/jmh/samples/) provided
+For help in writing correct JMH tests, the best place to start is the [sample code](https://hg.openjdk.java.net/code-tools/jmh/file/tip/jmh-samples/src/main/java/org/openjdk/jmh/samples/) provided
by the JMH project.

Typically, JMH is expected to run as a separate project in Maven. The jmh-benchmarks module uses
6 changes: 3 additions & 3 deletions release_notes.py
@@ -98,16 +98,16 @@ def issue_type_key(issue):

print "<h1>Release Notes - Kafka - Version %s</h1>" % version
print """<p>Below is a summary of the JIRA issues addressed in the %(version)s release of Kafka. For full documentation of the
-release, a guide to get started, and information about the project, see the <a href="http://kafka.apache.org/">Kafka
+release, a guide to get started, and information about the project, see the <a href="https://kafka.apache.org/">Kafka
project site</a>.</p>
<p><b>Note about upgrades:</b> Please carefully review the
<a href="http://kafka.apache.org/%(minor)s/documentation.html#upgrade">upgrade documentation</a> for this release thoroughly
<a href="https://kafka.apache.org/%(minor)s/documentation.html#upgrade">upgrade documentation</a> for this release thoroughly
before upgrading your cluster. The upgrade notes discuss any critical information about incompatibilities and breaking
changes, performance changes, and any other changes that might impact your production deployment of Kafka.</p>
<p>The documentation for the most recent release can be found at
<a href="http://kafka.apache.org/documentation.html">http://kafka.apache.org/documentation.html</a>.</p>""" % { 'version': version, 'minor': minor_version_dotless }
<a href="https://kafka.apache.org/documentation.html">https://kafka.apache.org/documentation.html</a>.</p>""" % { 'version': version, 'minor': minor_version_dotless }
for itype, issues in by_group:
print "<h2>%s</h2>" % itype
print "<ul>"
6 changes: 3 additions & 3 deletions tests/README.md
@@ -357,7 +357,7 @@ For a tutorial on how to setup and run the Kafka system tests, see
https://cwiki.apache.org/confluence/display/KAFKA/tutorial+-+set+up+and+run+Kafka+system+tests+with+ducktape

* Install Virtual Box from [https://www.virtualbox.org/](https://www.virtualbox.org/) (run `$ vboxmanage --version` to check if it's installed).
-* Install Vagrant >= 1.6.4 from [http://www.vagrantup.com/](http://www.vagrantup.com/) (run `vagrant --version` to check if it's installed).
+* Install Vagrant >= 1.6.4 from [https://www.vagrantup.com/](https://www.vagrantup.com/) (run `vagrant --version` to check if it's installed).
* Install system test dependencies, including ducktape, a command-line tool and library for testing distributed systems. We recommend to use virtual env for system test development

$ cd kafka/tests
@@ -401,12 +401,12 @@ Preparation
In these steps, we will create an IAM role which has permission to create and destroy EC2 instances,
set up a keypair used for ssh access to the test driver and worker machines, and create a security group to allow the test driver and workers to all communicate via TCP.

-* [Create an IAM role](http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html). We'll give this role the ability to launch or kill additional EC2 machines.
+* [Create an IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user.html). We'll give this role the ability to launch or kill additional EC2 machines.
- Create role "kafkatest-master"
- Role type: Amazon EC2
- Attach policy: AmazonEC2FullAccess (this will allow our test-driver to create and destroy EC2 instances)

-* If you haven't already, [set up a keypair to use for SSH access](http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html). For the purpose
+* If you haven't already, [set up a keypair to use for SSH access](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-key-pairs.html). For the purpose
of this quickstart, let's say the keypair name is kafkatest, and you've saved the private key in kafktest.pem

* Next, create a EC2 security group called "kafkatest".
2 changes: 1 addition & 1 deletion tests/bootstrap-test-env.sh
@@ -36,7 +36,7 @@ echo "Checking Vagrant installation..."
vagrant_version=`vagrant --version | egrep -o "[0-9]+\.[0-9]+\.[0-9]+"`
bad_vagrant=false
if [ "$(version $vagrant_version)" -lt "$(version 1.6.4)" ]; then
echo "Found Vagrant version $vagrant_version. Please upgrade to 1.6.4 or higher (see http://www.vagrantup.com for details)"
echo "Found Vagrant version $vagrant_version. Please upgrade to 1.6.4 or higher (see https://www.vagrantup.com for details)"
bad_vagrant=true
else
echo "Vagrant installation looks good."
2 changes: 1 addition & 1 deletion vagrant/README.md
@@ -3,7 +3,7 @@
Using Vagrant to get up and running.

1) Install Virtual Box [https://www.virtualbox.org/](https://www.virtualbox.org/)
-2) Install Vagrant >= 1.6.4 [http://www.vagrantup.com/](http://www.vagrantup.com/)
+2) Install Vagrant >= 1.6.4 [https://www.vagrantup.com/](https://www.vagrantup.com/)
3) Install Vagrant Plugins:

$ vagrant plugin install vagrant-hostmanager
