
Update to version v1.0.1 #25

Merged
1 commit merged on Jul 3, 2024
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bug_report.md
@@ -17,7 +17,7 @@ Steps to reproduce the behavior.
A clear and concise description of what you expected to happen.

**Please complete the following information about the solution:**
- - [ ] Version: [e.g. v1.0.0]
+ - [ ] Version: [e.g. v1.0.1]

To get the version of the solution, you can look at the description of the created CloudFormation stack. For example, "_(SO0021) - Video On Demand workflow with AWS Step Functions, MediaConvert, MediaPackage, S3, CloudFront and DynamoDB. Version **v5.0.0**_". If the description does not contain the version information, you can look at the mappings section of the template:
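
As an illustrative aside rather than part of the issue template, the same information can be pulled with the AWS CLI. This is a minimal sketch assuming the CLI is configured for the account and region where the solution is deployed; the stack name below is a placeholder:

```bash
# Read the solution version from the stack description (placeholder stack name).
aws cloudformation describe-stacks \
  --stack-name my-druid-solution-stack \
  --query "Stacks[0].Description" \
  --output text

# If the description lacks the version, inspect the template's Mappings section
# (the query works as-is when the deployed template is JSON).
aws cloudformation get-template \
  --stack-name my-druid-solution-stack \
  --query "TemplateBody.Mappings"
```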

36 changes: 36 additions & 0 deletions CHANGELOG.md
@@ -1,9 +1,45 @@
# Change Log

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [1.0.1] - 2024-07-01

### Fixed

- Fix the outdated segmentCache selection strategy runtime config [#11](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/11)
- Fix log/metrics endpoints when FIPS is enabled [#14](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/14)

### Added

- Allow the solution to configure the internal system [#7](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/7)
- Update ZooKeeper netplan rendering to handle the Docker bridge network interface [#8](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/8)
- Add support for defining custom OIDC scopes [#9](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/9)
- Bump CloudWatch Synthetics runtime version [#10](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/10)
- Add VPC to all Lambdas and allow users to self-manage install bucket assets [#15](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/15)
- Set up NVMe disks for data/historical/middleManager nodes [#16](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/16)
- Use the proper CloudFormation endpoint and update the Name tag to include the tier [#22](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/22)
- Add graceful shutdown for the Druid process [#23](https://github.com/aws-solutions/scalable-analytics-using-apache-druid-on-aws/pull/23)

### Changed

- Updated OidcAuthenticator.java, OidcConfig.java, OidcFilter.java, and OidcSessionStore.java for the pac4j version change
- Updated EC2 user data for provisioning changes
- Deprecated RDS certificate name changed from RDS_CA_RDS2048_G1 to RDS_CA_RSA2048_G1
- Deprecated CloudWatch VPC endpoint name changed from CLOUDWATCH to CLOUDWATCH_MONITORING
- Updated README instructions
- CDK version updated to 2.146.0 (a quick local verification sketch follows this changelog excerpt)
- Druid release updated to 29.0.1
- braces package updated to 3.0.3 due to CVE-2024-4068
- Unit test improvements
- pac4j package updated to 4.5.7 due to CVE-2021-44878
- druid-oidc updated to 29.0.1
- guava updated to 32.0.0-jre due to CVE-2023-2976

## [1.0.0] - 2024-01-09

### Added

- All files, initial version
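
A minimal sketch, not part of the changelog itself, for spot-checking the bumped dependency versions locally. It assumes the repository is checked out with Node.js and Maven available; the package names and module path are assumptions, not confirmed by this PR:

```bash
# JavaScript side: confirm the resolved CDK and braces versions.
npm ls aws-cdk-lib braces

# Java side: confirm the guava version in whichever Maven module declares it
# (the DruidCloudwatchExtension path is used here only as an example).
mvn dependency:tree -Dincludes=com.google.guava:guava \
    -f source/DruidCloudwatchExtension/pom.xml
```
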
175 changes: 0 additions & 175 deletions LICENSE

This file was deleted.

1 change: 0 additions & 1 deletion NOTICE

This file was deleted.

2 changes: 1 addition & 1 deletion NOTICE.txt
@@ -13,7 +13,7 @@ This software includes third party software subject to the following copyrights:

./source
==========
- @aws-cdk/lambda-layer-kubectl-v23@2.0.8 Apache-2.0
+ @aws-cdk/lambda-layer-kubectl-v29@2.0.0 Apache-2.0
@aws-cdk/[email protected] Apache-2.0
@aws-sdk/[email protected] Apache-2.0
@aws-sdk/[email protected] Apache-2.0
6 changes: 3 additions & 3 deletions deployment/build-s3-dist.sh
@@ -21,7 +21,7 @@
# Parameters:
# - source-bucket-base-name: Name for the S3 bucket location where the template will source the Lambda
# code from. The template will append '-[region_name]' to this bucket name.
- # For example: ./build-s3-dist.sh solutions v1.0.0
+ # For example: ./build-s3-dist.sh solutions v1.0.1
# The template will then expect the source code to be located in the solutions-[region_name] bucket
# - solution-name: name of the solution for consistency
# - version-code: version of the package
@@ -33,7 +33,7 @@ normal=$(tput sgr0)
# SETTINGS
#------------------------------------------------------------------------------
# Important: CDK global version number
- cdk_version=2.115.0
+ cdk_version=2.140.0
# Note: should match package.json
template_format="json"
run_helper="false"
@@ -60,7 +60,7 @@ usage()
{
echo "Usage: $0 bucket solution-name version"
echo "Please provide the base source bucket name, trademarked solution name, and version."
- echo "For example: ./build-s3-dist.sh mybucket my-solution v1.0.0"
+ echo "For example: ./build-s3-dist.sh mybucket my-solution v1.0.1"
exit 1
}

6 changes: 3 additions & 3 deletions source/DruidCloudwatchExtension/pom.xml
@@ -26,7 +26,7 @@
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
- <version>1.18.20</version>
+ <version>1.18.30</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-cloudwatch -->
<dependency>
@@ -37,7 +37,7 @@
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
- <version>4.12</version>
+ <version>4.13.1</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -157,4 +157,4 @@
</plugin>
</plugins>
</build>
- </project>
+ </project>
@@ -81,7 +81,8 @@ public class CloudwatchEmitter implements Emitter {

private final ObjectMapper jsonMapper;

- private final CloudwatchEmitterConfig config;
+ // set to nosonar because it causes a false positive
+ private final CloudwatchEmitterConfig config; // NOSONAR

private final DruidMonitoringMetricsFactory druidMonitoringMetricsFactory;

@@ -24,7 +24,7 @@
@Data
public class CloudwatchEmitterConfig {
static final int CLOUDWATCH_METRICS_MEMORY_LIMIT = 100000000;
- static final String SOLUTION_VERSION = "v1.0.0";
+ static final String SOLUTION_VERSION = "v1.0.1";

@JsonProperty("batchSize")
@Nullable
@@ -57,15 +57,15 @@ public void testGetBatchSize_withDefaultBatchSize() {
@Test
public void testToString_withValidConfig() {
// arrange
- CloudwatchEmitterConfig config = new CloudwatchEmitterConfig("test-cluster", 200, "v1.0.0");
+ CloudwatchEmitterConfig config = new CloudwatchEmitterConfig("test-cluster", 200, "v1.0.1");

// act
String actual = config.toString();

// assert
Assert.assertTrue(actual.contains("test-cluster"));
Assert.assertTrue(actual.contains("200"));
- Assert.assertTrue(actual.contains("v1.0.0"));
+ Assert.assertTrue(actual.contains("v1.0.1"));
Assert.assertTrue(actual.contains("CloudwatchEmitterConfig"));
}

@@ -30,13 +30,17 @@
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.mockito.Spy;

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;
import com.amazonaws.services.cloudwatch.model.PutMetricDataRequest;
import com.amazonaws.services.cloudwatch.model.StandardUnit;
import static org.junit.Assert.*;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.anyLong;
import static org.mockito.Mockito.*;

public class CloudwatchEmitterTest {
@@ -95,8 +99,17 @@ public void testSendMetricToCloudwatch() throws InterruptedException {
eventMetricDatum.setMetricName("event-metric");
eventMetricDatum.setValue(1.0);
eventMetricDatum.setUnit(StandardUnit.Count);

+ Dimension dimensionEventMetric = new Dimension();
+ dimensionEventMetric.setName("test-dimension");
+ dimensionEventMetric.setValue("test-value");
+ List<Dimension> dimensionsEventMetric = new ArrayList<>();
+ dimensionsEventMetric.add(dimensionEventMetric);

eventMetricDatum.setDimensions(
- List.of(new Dimension().withName("test-dimension").withValue("test-value")));
+ dimensionsEventMetric
+ );

ObjectContainer<MetricDatum> eventMetricContainer =
emitter.getObjectContainer(eventMetricDatum);
metricQueue.offer(eventMetricContainer);
@@ -105,8 +118,15 @@ public void testSendMetricToCloudwatch() throws InterruptedException {
alertMetricDatum.setMetricName("alert-metric");
alertMetricDatum.setValue(1.0);
alertMetricDatum.setUnit(StandardUnit.Count);

+ Dimension dimensionAlertMetric = new Dimension();
+ dimensionAlertMetric.setName("test-dimension");
+ dimensionAlertMetric.setValue("test-value");
+ List<Dimension> dimensionsAlertMetric = new ArrayList<>();
+ dimensionsAlertMetric.add(dimensionAlertMetric);

alertMetricDatum.setDimensions(
- List.of(new Dimension().withName("test-dimension").withValue("test-value")));
+ dimensionsAlertMetric);
ObjectContainer<MetricDatum> alertMetricContainer =
emitter.getObjectContainer(alertMetricDatum);
alertQueue.offer(alertMetricContainer);