Merge pull request #16870 from prometheus/merge-3.5-into-main

Merge release-3.5 into main

Commit a9438f020d

CHANGELOG.md (43 changed lines)
@@ -2,9 +2,50 @@
 
 ## main / unreleased
 
-* [FEATURE] OTLP receiver: Support promoting OTel scope name/version/schema URL/attributes as metric labels, enable via configuration parameter `otlp.promote_scope_metadata`. #16730 #16760
 * [BUGFIX] OTLP receiver: Generate `target_info` samples between the earliest and latest samples per resource. #16737
 
+## 3.5.0 / 2025-07-14
+
+* [FEATURE] PromQL: Add experimental type and unit metadata labels, behind feature flag `type-and-unit-labels`. #16228 #16632 #16718 #16743
+* [FEATURE] PromQL: Add `ts_of_(min|max|last)_over_time`, behind feature flag `experimental-promql-functions`. #16722 #16733
+* [FEATURE] Scraping: Add global option `always_scrape_classic_histograms` to scrape a classic histogram even if it is also exposed as native. #16452
+* [FEATURE] OTLP: New config options `promote_all_resource_attributes` and `ignore_resource_attributes`. #16426
+* [FEATURE] Discovery: New service discovery for STACKIT Cloud. #16401
+* [ENHANCEMENT] Hetzner SD: Add `label_selector` to filter servers. #16512
+* [ENHANCEMENT] PromQL: support non-constant parameter in aggregations like `quantile` and `topk`. #16404
+* [ENHANCEMENT] UI: Better total target count display when using `keep_dropped_targets` option. #16604
+* [ENHANCEMENT] UI: Add simple filtering on the `/rules` page. #16605
+* [ENHANCEMENT] UI: Display query stats in hover tooltip over table query tab. #16723
+* [ENHANCEMENT] UI: Clear search field on `/targets` page. #16567
+* [ENHANCEMENT] Rules: Check that rules parse without error earlier at startup. #16601
+* [ENHANCEMENT] Promtool: Optional fuzzy float64 comparison in rules unittests. #16395
+* [PERF] PromQL: Reuse `histogramStatsIterator` where possible. #16686
+* [PERF] PromQL: Reuse storage for custom bucket values for native histograms. #16565
+* [PERF] UI: Optimize memoization and search debouncing on `/targets` page. #16589
+* [PERF] UI: Fix full-page re-rendering when opening status nav menu. #16590
+* [PERF] Kubernetes SD: use service cache.Indexer to achieve better performance. #16365
+* [PERF] TSDB: Optionally use Direct IO for chunks writing. #15365
+* [PERF] TSDB: When fetching label values, stop work earlier if the limit is reached. #16158
+* [PERF] Labels: Simpler/faster stringlabels encoding. #16069
+* [PERF] Scraping: Reload scrape pools concurrently. #16595 #16783
+* [BUGFIX] Top-level: Update GOGC before loading TSDB. #16491
+* [BUGFIX] Config: Respect GOGC environment variable if no "runtime" block exists. #16558
+* [BUGFIX] PromQL: Fix native histogram `last_over_time`. #16744
+* [BUGFIX] PromQL: Fix reported parser position range in errors for aggregations wrapped in ParenExpr. #16041 #16754
+* [BUGFIX] PromQL: Don't emit a value from `histogram_fraction` or `histogram_quantile` if classic and native histograms are present at the same timestamp. #16552
+* [BUGFIX] PromQL: Incorrect rounding of `[1001ms]` to `[1s]` and similar. #16478
+* [BUGFIX] PromQL: Fix inconsistent / sometimes negative `histogram_count` and `histogram_sum`. #16682
+* [BUGFIX] PromQL: Improve handling of NaNs in native histograms. #16724
+* [BUGFIX] PromQL: Fix unary operator precedence in duration expressions. #16713
+* [BUGFIX] PromQL: Improve consistency of `avg` aggregation and `avg_over_time`. #16569 #16773
+* [BUGFIX] UI: Add query warnings and info to graph view. #16753 #16759
+* [BUGFIX] API: Add HTTP `Vary: Origin` header to responses to avoid cache poisoning. #16008
+* [BUGFIX] Discovery: Avoid deadlocks by taking locks in consistent order. #16587
+* [BUGFIX] Remote-write: For Azure AD auth, allow empty `client_id` to support system assigned managed identity. #16421
+* [BUGFIX] Scraping: Fix rare memory corruption bug. #16623
+* [BUGFIX] Scraping: continue handling custom-bucket histograms after an exponential histogram is encountered. #16720
+* [BUGFIX] OTLP: Default config not respected when `otlp:` block is unset. #16693
+
 ## 3.4.2 / 2025-06-26
 
 * [BUGFIX] OTLP receiver: Fix default configuration not being respected if the `otlp:` block is unset in the config file. #16693
@@ -1604,9 +1604,6 @@ type OTLPConfig struct {
 TranslationStrategy translationStrategyOption `yaml:"translation_strategy,omitempty"`
 KeepIdentifyingResourceAttributes bool `yaml:"keep_identifying_resource_attributes,omitempty"`
 ConvertHistogramsToNHCB bool `yaml:"convert_histograms_to_nhcb,omitempty"`
-// PromoteScopeMetadata controls whether to promote OTel scope metadata (i.e. name, version, schema URL, and attributes) to metric labels.
-// As per OTel spec, the aforementioned scope metadata should be identifying, i.e. made into metric labels.
-PromoteScopeMetadata bool `yaml:"promote_scope_metadata,omitempty"`
 }
 
 // UnmarshalYAML implements the yaml.Unmarshaler interface.
@@ -1808,20 +1808,6 @@ func TestOTLPConvertHistogramsToNHCB(t *testing.T) {
 })
 }
 
-func TestOTLPPromoteScopeMetadata(t *testing.T) {
-t.Run("good config", func(t *testing.T) {
-want, err := LoadFile(filepath.Join("testdata", "otlp_promote_scope_metadata.good.yml"), false, promslog.NewNopLogger())
-require.NoError(t, err)
-
-out, err := yaml.Marshal(want)
-require.NoError(t, err)
-var got Config
-require.NoError(t, yaml.UnmarshalStrict(out, &got))
-
-require.True(t, got.OTLPConfig.PromoteScopeMetadata)
-})
-}
-
 func TestOTLPAllowUTF8(t *testing.T) {
 t.Run("good config - NoUTF8EscapingWithSuffixes", func(t *testing.T) {
 fpath := filepath.Join("testdata", "otlp_allow_utf8.good.yml")
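For reference, the deleted `TestOTLPPromoteScopeMetadata` above exercised the option end to end: load a config whose `otlp:` block sets `promote_scope_metadata: true`, round-trip it through YAML, and check the parsed flag. A condensed sketch of that flow is below; it assumes the same package-internal helpers that appear elsewhere in this diff (`LoadFile`, `Config`, `promslog.NewNopLogger`) and a hypothetical fixture file with the option enabled.

```go
func TestOTLPPromoteScopeMetadataSketch(t *testing.T) {
	// Load a config file whose otlp block sets promote_scope_metadata: true
	// (the fixture name here is illustrative).
	cfg, err := LoadFile(filepath.Join("testdata", "otlp_promote_scope_metadata.good.yml"), false, promslog.NewNopLogger())
	require.NoError(t, err)

	// Round-trip through YAML to make sure the flag survives marshalling.
	out, err := yaml.Marshal(cfg)
	require.NoError(t, err)
	var got Config
	require.NoError(t, yaml.UnmarshalStrict(out, &got))
	require.True(t, got.OTLPConfig.PromoteScopeMetadata)
}
```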
@@ -1,2 +0,0 @@
-otlp:
-  promote_scope_metadata: true
@@ -219,9 +219,6 @@ otlp:
 [ keep_identifying_resource_attributes: <boolean> | default = false ]
 # Configures optional translation of OTLP explicit bucket histograms into native histograms with custom buckets.
 [ convert_histograms_to_nhcb: <boolean> | default = false ]
-# Enables promotion of OTel scope metadata (i.e. name, version, schema URL, and attributes) to metric labels.
-# This is disabled by default for backwards compatibility, but according to OTel spec, scope metadata _should_ be identifying, i.e. translated to metric labels.
-[ promote_scope_metadata: <boolean> | default = false ]
 
 # Settings related to the remote read feature.
 remote_read:
@@ -117,7 +117,7 @@ var seps = []byte{'\xff'}
 // Unpaired string values are ignored. String pairs overwrite OTLP labels if collisions happen and
 // if logOnOverwrite is true, the overwrite is logged. Resulting label names are sanitized.
 // If settings.PromoteResourceAttributes is not empty, it's a set of resource attributes that should be promoted to labels.
-func createAttributes(resource pcommon.Resource, attributes pcommon.Map, scope scope, settings Settings,
+func createAttributes(resource pcommon.Resource, attributes pcommon.Map, settings Settings,
 ignoreAttrs []string, logOnOverwrite bool, extras ...string,
 ) []prompb.Label {
 resourceAttrs := resource.Attributes()
@@ -126,14 +126,8 @@ func createAttributes(resource pcommon.Resource, attributes pcommon.Map, scope s
 
 promotedAttrs := settings.PromoteResourceAttributes.promotedAttributes(resourceAttrs)
 
-promoteScope := settings.PromoteScopeMetadata && scope.name != ""
-scopeLabelCount := 0
-if promoteScope {
-// Include name, version and schema URL.
-scopeLabelCount = scope.attributes.Len() + 3
-}
 // Calculate the maximum possible number of labels we could return so we can preallocate l.
-maxLabelCount := attributes.Len() + len(settings.ExternalLabels) + len(promotedAttrs) + scopeLabelCount + len(extras)/2
+maxLabelCount := attributes.Len() + len(settings.ExternalLabels) + len(promotedAttrs) + len(extras)/2
 
 if haveServiceName {
 maxLabelCount++
@@ -173,17 +167,6 @@ func createAttributes(resource pcommon.Resource, attributes pcommon.Map, scope s
 l[normalized] = lbl.Value
 }
 }
-if promoteScope {
-l["otel_scope_name"] = scope.name
-l["otel_scope_version"] = scope.version
-l["otel_scope_schema_url"] = scope.schemaURL
-scope.attributes.Range(func(k string, v pcommon.Value) bool {
-name := "otel_scope_" + k
-name = labelNamer.Build(name)
-l[name] = v.AsString()
-return true
-})
-}
 
 // Map service.name + service.namespace to job.
 if haveServiceName {
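The block removed above is the core of the scope-promotion behaviour: when `promoteScope` is true, the scope name, version, schema URL and every scope attribute are folded into the label map under `otel_scope_*` names. As a standalone illustration (not code from the repository), the mapping amounts to the following; the real implementation additionally runs each generated attribute name through `labelNamer.Build` for sanitization.

```go
// scopeLabels sketches how promoted scope metadata becomes labels,
// mirroring the assignments in the block removed above.
func scopeLabels(name, version, schemaURL string, attrs map[string]string) map[string]string {
	l := map[string]string{
		"otel_scope_name":       name,
		"otel_scope_version":    version,
		"otel_scope_schema_url": schemaURL,
	}
	for k, v := range attrs {
		// The removed code builds "otel_scope_"+k and sanitizes it via
		// labelNamer.Build; plain concatenation is used here for brevity.
		l["otel_scope_"+k] = v
	}
	return l
}
```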
@@ -254,7 +237,7 @@ func aggregationTemporality(metric pmetric.Metric) (pmetric.AggregationTemporali
 // However, work is under way to resolve this shortcoming through a feature called native histograms custom buckets:
 // https://github.com/prometheus/prometheus/issues/13485.
 func (c *PrometheusConverter) addHistogramDataPoints(ctx context.Context, dataPoints pmetric.HistogramDataPointSlice,
-resource pcommon.Resource, settings Settings, baseName string, scope scope,
+resource pcommon.Resource, settings Settings, baseName string,
 ) error {
 for x := 0; x < dataPoints.Len(); x++ {
 if err := c.everyN.checkContext(ctx); err != nil {
@@ -263,7 +246,7 @@ func (c *PrometheusConverter) addHistogramDataPoints(ctx context.Context, dataPo
 
 pt := dataPoints.At(x)
 timestamp := convertTimeStamp(pt.Timestamp())
-baseLabels := createAttributes(resource, pt.Attributes(), scope, settings, nil, false)
+baseLabels := createAttributes(resource, pt.Attributes(), settings, nil, false)
 
 // If the sum is unset, it indicates the _sum metric point should be
 // omitted
@@ -464,7 +447,7 @@ func findMinAndMaxTimestamps(metric pmetric.Metric, minTimestamp, maxTimestamp p
 }
 
 func (c *PrometheusConverter) addSummaryDataPoints(ctx context.Context, dataPoints pmetric.SummaryDataPointSlice, resource pcommon.Resource,
-settings Settings, baseName string, scope scope,
+settings Settings, baseName string,
 ) error {
 for x := 0; x < dataPoints.Len(); x++ {
 if err := c.everyN.checkContext(ctx); err != nil {
@@ -473,7 +456,7 @@ func (c *PrometheusConverter) addSummaryDataPoints(ctx context.Context, dataPoin
 
 pt := dataPoints.At(x)
 timestamp := convertTimeStamp(pt.Timestamp())
-baseLabels := createAttributes(resource, pt.Attributes(), scope, settings, nil, false)
+baseLabels := createAttributes(resource, pt.Attributes(), settings, nil, false)
 
 // treat sum as a sample in an individual TimeSeries
 sum := &prompb.Sample{
@@ -626,7 +609,7 @@ func addResourceTargetInfo(resource pcommon.Resource, settings Settings, earlies
 // Do not pass identifying attributes as ignoreAttrs below.
 identifyingAttrs = nil
 }
-labels := createAttributes(resource, attributes, scope{}, settings, identifyingAttrs, false, model.MetricNameLabel, name)
+labels := createAttributes(resource, attributes, settings, identifyingAttrs, false, model.MetricNameLabel, name)
 haveIdentifier := false
 for _, l := range labels {
 if l.Name == model.JobLabel || l.Name == model.InstanceLabel {
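For context on the `haveIdentifier` check in the last hunk above: `target_info` is only emitted when the created labels include `job` or `instance`, which the translator derives from resource attributes (see the "Map service.name + service.namespace to job." comment earlier in this file). A rough sketch of that conventional mapping, written here as an assumption about the surrounding code rather than a quote from it:

```go
// jobAndInstance sketches the usual OTel-to-Prometheus identity mapping:
// job is "service.namespace/service.name" (or just service.name when the
// namespace is empty), and instance is service.instance.id.
func jobAndInstance(serviceName, serviceNamespace, serviceInstanceID string) (job, instance string) {
	job = serviceName
	if serviceNamespace != "" {
		job = serviceNamespace + "/" + serviceName
	}
	return job, serviceInstanceID
}
```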
@@ -43,17 +43,6 @@ func TestCreateAttributes(t *testing.T) {
 // This one is for testing conflict with auto-generated instance attribute.
 "instance": "resource value",
 }
-scopeAttrs := pcommon.NewMap()
-scopeAttrs.FromRaw(map[string]any{
-"attr1": "value1",
-"attr2": "value2",
-})
-defaultScope := scope{
-name: "test-scope",
-version: "1.0.0",
-schemaURL: "https://schema.com",
-attributes: scopeAttrs,
-}
 
 resource := pcommon.NewResource()
 for k, v := range resourceAttrs {
@@ -65,19 +54,15 @@ func TestCreateAttributes(t *testing.T) {
 
 testCases := []struct {
 name string
-scope scope
 promoteAllResourceAttributes bool
 promoteResourceAttributes []string
-promoteScope bool
 ignoreResourceAttributes []string
 ignoreAttrs []string
 expectedLabels []prompb.Label
 }{
 {
-name: "Successful conversion without resource attribute promotion and without scope promotion",
+name: "Successful conversion without resource attribute promotion",
-scope: defaultScope,
 promoteResourceAttributes: nil,
-promoteScope: false,
 expectedLabels: []prompb.Label{
 {
 Name: "__name__",
@ -102,86 +87,8 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "Successful conversion without resource attribute promotion and with scope promotion",
|
name: "Successful conversion with some attributes ignored",
|
||||||
scope: defaultScope,
|
|
||||||
promoteResourceAttributes: nil,
|
promoteResourceAttributes: nil,
|
||||||
promoteScope: true,
|
|
||||||
expectedLabels: []prompb.Label{
|
|
||||||
{
|
|
||||||
Name: "__name__",
|
|
||||||
Value: "test_metric",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "instance",
|
|
||||||
Value: "service ID",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "job",
|
|
||||||
Value: "service name",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "metric_attr",
|
|
||||||
Value: "metric value",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "metric_attr_other",
|
|
||||||
Value: "metric value other",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "Successful conversion without resource attribute promotion and with scope promotion, but without scope",
|
|
||||||
scope: scope{},
|
|
||||||
promoteResourceAttributes: nil,
|
|
||||||
promoteScope: true,
|
|
||||||
expectedLabels: []prompb.Label{
|
|
||||||
{
|
|
||||||
Name: "__name__",
|
|
||||||
Value: "test_metric",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "instance",
|
|
||||||
Value: "service ID",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "job",
|
|
||||||
Value: "service name",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "metric_attr",
|
|
||||||
Value: "metric value",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "metric_attr_other",
|
|
||||||
Value: "metric value other",
|
|
||||||
},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "Successful conversion with some attributes ignored and with scope promotion",
|
|
||||||
scope: defaultScope,
|
|
||||||
promoteResourceAttributes: nil,
|
|
||||||
promoteScope: true,
|
|
||||||
ignoreAttrs: []string{"metric-attr-other"},
|
ignoreAttrs: []string{"metric-attr-other"},
|
||||||
expectedLabels: []prompb.Label{
|
expectedLabels: []prompb.Label{
|
||||||
{
|
{
|
||||||
@ -200,33 +107,11 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
Name: "metric_attr",
|
Name: "metric_attr",
|
||||||
Value: "metric value",
|
Value: "metric value",
|
||||||
},
|
},
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "Successful conversion with resource attribute promotion and with scope promotion",
|
name: "Successful conversion with resource attribute promotion",
|
||||||
scope: defaultScope,
|
|
||||||
promoteResourceAttributes: []string{"non-existent-attr", "existent-attr"},
|
promoteResourceAttributes: []string{"non-existent-attr", "existent-attr"},
|
||||||
promoteScope: true,
|
|
||||||
expectedLabels: []prompb.Label{
|
expectedLabels: []prompb.Label{
|
||||||
{
|
{
|
||||||
Name: "__name__",
|
Name: "__name__",
|
||||||
@ -252,33 +137,11 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
Name: "existent_attr",
|
Name: "existent_attr",
|
||||||
Value: "resource value",
|
Value: "resource value",
|
||||||
},
|
},
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "Successful conversion with resource attribute promotion and with scope promotion, conflicting resource attributes are ignored",
|
name: "Successful conversion with resource attribute promotion, conflicting resource attributes are ignored",
|
||||||
scope: defaultScope,
|
|
||||||
promoteResourceAttributes: []string{"non-existent-attr", "existent-attr", "metric-attr", "job", "instance"},
|
promoteResourceAttributes: []string{"non-existent-attr", "existent-attr", "metric-attr", "job", "instance"},
|
||||||
promoteScope: true,
|
|
||||||
expectedLabels: []prompb.Label{
|
expectedLabels: []prompb.Label{
|
||||||
{
|
{
|
||||||
Name: "__name__",
|
Name: "__name__",
|
||||||
@ -304,33 +167,11 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
Name: "metric_attr_other",
|
Name: "metric_attr_other",
|
||||||
Value: "metric value other",
|
Value: "metric value other",
|
||||||
},
|
},
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "Successful conversion with resource attribute promotion and with scope promotion, attributes are only promoted once",
|
name: "Successful conversion with resource attribute promotion, attributes are only promoted once",
|
||||||
scope: defaultScope,
|
|
||||||
promoteResourceAttributes: []string{"existent-attr", "existent-attr"},
|
promoteResourceAttributes: []string{"existent-attr", "existent-attr"},
|
||||||
promoteScope: true,
|
|
||||||
expectedLabels: []prompb.Label{
|
expectedLabels: []prompb.Label{
|
||||||
{
|
{
|
||||||
Name: "__name__",
|
Name: "__name__",
|
||||||
@ -356,33 +197,11 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
Name: "metric_attr_other",
|
Name: "metric_attr_other",
|
||||||
Value: "metric value other",
|
Value: "metric value other",
|
||||||
},
|
},
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "Successful conversion promoting all resource attributes and with scope promotion",
|
name: "Successful conversion promoting all resource attributes",
|
||||||
scope: defaultScope,
|
|
||||||
promoteAllResourceAttributes: true,
|
promoteAllResourceAttributes: true,
|
||||||
promoteScope: true,
|
|
||||||
expectedLabels: []prompb.Label{
|
expectedLabels: []prompb.Label{
|
||||||
{
|
{
|
||||||
Name: "__name__",
|
Name: "__name__",
|
||||||
@ -416,33 +235,11 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
Name: "service_instance_id",
|
Name: "service_instance_id",
|
||||||
Value: "service ID",
|
Value: "service ID",
|
||||||
},
|
},
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "Successful conversion promoting all resource attributes and with scope promotion, ignoring 'service.instance.id'",
|
name: "Successful conversion promoting all resource attributes, ignoring 'service.instance.id'",
|
||||||
scope: defaultScope,
|
|
||||||
promoteAllResourceAttributes: true,
|
promoteAllResourceAttributes: true,
|
||||||
promoteScope: true,
|
|
||||||
ignoreResourceAttributes: []string{
|
ignoreResourceAttributes: []string{
|
||||||
"service.instance.id",
|
"service.instance.id",
|
||||||
},
|
},
|
||||||
@ -475,26 +272,6 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
Name: "service_name",
|
Name: "service_name",
|
||||||
Value: "service name",
|
Value: "service name",
|
||||||
},
|
},
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
}
|
}
|
||||||
@ -506,9 +283,8 @@ func TestCreateAttributes(t *testing.T) {
|
|||||||
PromoteResourceAttributes: tc.promoteResourceAttributes,
|
PromoteResourceAttributes: tc.promoteResourceAttributes,
|
||||||
IgnoreResourceAttributes: tc.ignoreResourceAttributes,
|
IgnoreResourceAttributes: tc.ignoreResourceAttributes,
|
||||||
}),
|
}),
|
||||||
PromoteScopeMetadata: tc.promoteScope,
|
|
||||||
}
|
}
|
||||||
lbls := createAttributes(resource, attrs, tc.scope, settings, tc.ignoreAttrs, false, model.MetricNameLabel, "test_metric")
|
lbls := createAttributes(resource, attrs, settings, tc.ignoreAttrs, false, model.MetricNameLabel, "test_metric")
|
||||||
|
|
||||||
require.ElementsMatch(t, lbls, tc.expectedLabels)
|
require.ElementsMatch(t, lbls, tc.expectedLabels)
|
||||||
})
|
})
|
||||||
@ -534,28 +310,14 @@ func Test_convertTimeStamp(t *testing.T) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
||||||
scopeAttrs := pcommon.NewMap()
|
|
||||||
scopeAttrs.FromRaw(map[string]any{
|
|
||||||
"attr1": "value1",
|
|
||||||
"attr2": "value2",
|
|
||||||
})
|
|
||||||
defaultScope := scope{
|
|
||||||
name: "test-scope",
|
|
||||||
version: "1.0.0",
|
|
||||||
schemaURL: "https://schema.com",
|
|
||||||
attributes: scopeAttrs,
|
|
||||||
}
|
|
||||||
|
|
||||||
ts := pcommon.Timestamp(time.Now().UnixNano())
|
ts := pcommon.Timestamp(time.Now().UnixNano())
|
||||||
tests := []struct {
|
tests := []struct {
|
||||||
name string
|
name string
|
||||||
metric func() pmetric.Metric
|
metric func() pmetric.Metric
|
||||||
scope scope
|
want func() map[uint64]*prompb.TimeSeries
|
||||||
promoteScope bool
|
|
||||||
want func() map[uint64]*prompb.TimeSeries
|
|
||||||
}{
|
}{
|
||||||
{
|
{
|
||||||
name: "summary with start time and without scope promotion",
|
name: "summary with start time",
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_summary")
|
metric.SetName("test_summary")
|
||||||
@ -567,8 +329,6 @@ func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: false,
|
|
||||||
want: func() map[uint64]*prompb.TimeSeries {
|
want: func() map[uint64]*prompb.TimeSeries {
|
||||||
countLabels := []prompb.Label{
|
countLabels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_summary" + countStr},
|
{Name: model.MetricNameLabel, Value: "test_summary" + countStr},
|
||||||
@ -602,79 +362,7 @@ func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
|||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "summary with start time and with scope promotion",
|
name: "summary without start time",
|
||||||
metric: func() pmetric.Metric {
|
|
||||||
metric := pmetric.NewMetric()
|
|
||||||
metric.SetName("test_summary")
|
|
||||||
metric.SetEmptySummary()
|
|
||||||
|
|
||||||
dp := metric.Summary().DataPoints().AppendEmpty()
|
|
||||||
dp.SetTimestamp(ts)
|
|
||||||
dp.SetStartTimestamp(ts)
|
|
||||||
|
|
||||||
return metric
|
|
||||||
},
|
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: true,
|
|
||||||
want: func() map[uint64]*prompb.TimeSeries {
|
|
||||||
scopeLabels := []prompb.Label{
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
}
|
|
||||||
countLabels := append([]prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_summary" + countStr},
|
|
||||||
}, scopeLabels...)
|
|
||||||
sumLabels := append([]prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_summary" + sumStr},
|
|
||||||
}, scopeLabels...)
|
|
||||||
createdLabels := append([]prompb.Label{
|
|
||||||
{
|
|
||||||
Name: model.MetricNameLabel,
|
|
||||||
Value: "test_summary" + createdSuffix,
|
|
||||||
},
|
|
||||||
}, scopeLabels...)
|
|
||||||
return map[uint64]*prompb.TimeSeries{
|
|
||||||
timeSeriesSignature(countLabels): {
|
|
||||||
Labels: countLabels,
|
|
||||||
Samples: []prompb.Sample{
|
|
||||||
{Value: 0, Timestamp: convertTimeStamp(ts)},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
timeSeriesSignature(sumLabels): {
|
|
||||||
Labels: sumLabels,
|
|
||||||
Samples: []prompb.Sample{
|
|
||||||
{Value: 0, Timestamp: convertTimeStamp(ts)},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
timeSeriesSignature(createdLabels): {
|
|
||||||
Labels: createdLabels,
|
|
||||||
Samples: []prompb.Sample{
|
|
||||||
{Value: float64(convertTimeStamp(ts)), Timestamp: convertTimeStamp(ts)},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
}
|
|
||||||
},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "summary without start time and without scope promotion",
|
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_summary")
|
metric.SetName("test_summary")
|
||||||
@ -685,7 +373,6 @@ func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
promoteScope: false,
|
|
||||||
want: func() map[uint64]*prompb.TimeSeries {
|
want: func() map[uint64]*prompb.TimeSeries {
|
||||||
countLabels := []prompb.Label{
|
countLabels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_summary" + countStr},
|
{Name: model.MetricNameLabel, Value: "test_summary" + countStr},
|
||||||
@ -720,11 +407,9 @@ func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
|||||||
metric.Summary().DataPoints(),
|
metric.Summary().DataPoints(),
|
||||||
pcommon.NewResource(),
|
pcommon.NewResource(),
|
||||||
Settings{
|
Settings{
|
||||||
PromoteScopeMetadata: tt.promoteScope,
|
ExportCreatedMetric: true,
|
||||||
ExportCreatedMetric: true,
|
|
||||||
},
|
},
|
||||||
metric.Name(),
|
metric.Name(),
|
||||||
tt.scope,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
testutil.RequireEqual(t, tt.want(), converter.unique)
|
testutil.RequireEqual(t, tt.want(), converter.unique)
|
||||||
@ -734,28 +419,14 @@ func TestPrometheusConverter_AddSummaryDataPoints(t *testing.T) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func TestPrometheusConverter_AddHistogramDataPoints(t *testing.T) {
|
func TestPrometheusConverter_AddHistogramDataPoints(t *testing.T) {
|
||||||
scopeAttrs := pcommon.NewMap()
|
|
||||||
scopeAttrs.FromRaw(map[string]any{
|
|
||||||
"attr1": "value1",
|
|
||||||
"attr2": "value2",
|
|
||||||
})
|
|
||||||
defaultScope := scope{
|
|
||||||
name: "test-scope",
|
|
||||||
version: "1.0.0",
|
|
||||||
schemaURL: "https://schema.com",
|
|
||||||
attributes: scopeAttrs,
|
|
||||||
}
|
|
||||||
|
|
||||||
ts := pcommon.Timestamp(time.Now().UnixNano())
|
ts := pcommon.Timestamp(time.Now().UnixNano())
|
||||||
tests := []struct {
|
tests := []struct {
|
||||||
name string
|
name string
|
||||||
metric func() pmetric.Metric
|
metric func() pmetric.Metric
|
||||||
scope scope
|
want func() map[uint64]*prompb.TimeSeries
|
||||||
promoteScope bool
|
|
||||||
want func() map[uint64]*prompb.TimeSeries
|
|
||||||
}{
|
}{
|
||||||
{
|
{
|
||||||
name: "histogram with start time and without scope promotion",
|
name: "histogram with start time",
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_hist")
|
metric.SetName("test_hist")
|
||||||
@ -767,8 +438,6 @@ func TestPrometheusConverter_AddHistogramDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: false,
|
|
||||||
want: func() map[uint64]*prompb.TimeSeries {
|
want: func() map[uint64]*prompb.TimeSeries {
|
||||||
countLabels := []prompb.Label{
|
countLabels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist" + countStr},
|
{Name: model.MetricNameLabel, Value: "test_hist" + countStr},
|
||||||
@ -802,76 +471,6 @@ func TestPrometheusConverter_AddHistogramDataPoints(t *testing.T) {
|
|||||||
}
|
}
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
|
||||||
name: "histogram with start time and with scope promotion",
|
|
||||||
metric: func() pmetric.Metric {
|
|
||||||
metric := pmetric.NewMetric()
|
|
||||||
metric.SetName("test_hist")
|
|
||||||
metric.SetEmptyHistogram().SetAggregationTemporality(pmetric.AggregationTemporalityCumulative)
|
|
||||||
|
|
||||||
pt := metric.Histogram().DataPoints().AppendEmpty()
|
|
||||||
pt.SetTimestamp(ts)
|
|
||||||
pt.SetStartTimestamp(ts)
|
|
||||||
|
|
||||||
return metric
|
|
||||||
},
|
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: true,
|
|
||||||
want: func() map[uint64]*prompb.TimeSeries {
|
|
||||||
scopeLabels := []prompb.Label{
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr1",
|
|
||||||
Value: "value1",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_attr2",
|
|
||||||
Value: "value2",
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_name",
|
|
||||||
Value: defaultScope.name,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_schema_url",
|
|
||||||
Value: defaultScope.schemaURL,
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Name: "otel_scope_version",
|
|
||||||
Value: defaultScope.version,
|
|
||||||
},
|
|
||||||
}
|
|
||||||
countLabels := append([]prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist" + countStr},
|
|
||||||
}, scopeLabels...)
|
|
||||||
infLabels := append([]prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist_bucket"},
|
|
||||||
{Name: model.BucketLabel, Value: "+Inf"},
|
|
||||||
}, scopeLabels...)
|
|
||||||
createdLabels := append([]prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist" + createdSuffix},
|
|
||||||
}, scopeLabels...)
|
|
||||||
return map[uint64]*prompb.TimeSeries{
|
|
||||||
timeSeriesSignature(countLabels): {
|
|
||||||
Labels: countLabels,
|
|
||||||
Samples: []prompb.Sample{
|
|
||||||
{Value: 0, Timestamp: convertTimeStamp(ts)},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
timeSeriesSignature(infLabels): {
|
|
||||||
Labels: infLabels,
|
|
||||||
Samples: []prompb.Sample{
|
|
||||||
{Value: 0, Timestamp: convertTimeStamp(ts)},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
timeSeriesSignature(createdLabels): {
|
|
||||||
Labels: createdLabels,
|
|
||||||
Samples: []prompb.Sample{
|
|
||||||
{Value: float64(convertTimeStamp(ts)), Timestamp: convertTimeStamp(ts)},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
}
|
|
||||||
},
|
|
||||||
},
|
|
||||||
{
|
{
|
||||||
name: "histogram without start time",
|
name: "histogram without start time",
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
@ -919,11 +518,9 @@ func TestPrometheusConverter_AddHistogramDataPoints(t *testing.T) {
|
|||||||
metric.Histogram().DataPoints(),
|
metric.Histogram().DataPoints(),
|
||||||
pcommon.NewResource(),
|
pcommon.NewResource(),
|
||||||
Settings{
|
Settings{
|
||||||
ExportCreatedMetric: true,
|
ExportCreatedMetric: true,
|
||||||
PromoteScopeMetadata: tt.promoteScope,
|
|
||||||
},
|
},
|
||||||
metric.Name(),
|
metric.Name(),
|
||||||
tt.scope,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
require.Equal(t, tt.want(), converter.unique)
|
require.Equal(t, tt.want(), converter.unique)
|
||||||
|
@ -37,7 +37,6 @@ const defaultZeroThreshold = 1e-128
|
|||||||
// as native histogram samples.
|
// as native histogram samples.
|
||||||
func (c *PrometheusConverter) addExponentialHistogramDataPoints(ctx context.Context, dataPoints pmetric.ExponentialHistogramDataPointSlice,
|
func (c *PrometheusConverter) addExponentialHistogramDataPoints(ctx context.Context, dataPoints pmetric.ExponentialHistogramDataPointSlice,
|
||||||
resource pcommon.Resource, settings Settings, promName string, temporality pmetric.AggregationTemporality,
|
resource pcommon.Resource, settings Settings, promName string, temporality pmetric.AggregationTemporality,
|
||||||
scope scope,
|
|
||||||
) (annotations.Annotations, error) {
|
) (annotations.Annotations, error) {
|
||||||
var annots annotations.Annotations
|
var annots annotations.Annotations
|
||||||
for x := 0; x < dataPoints.Len(); x++ {
|
for x := 0; x < dataPoints.Len(); x++ {
|
||||||
@ -56,7 +55,6 @@ func (c *PrometheusConverter) addExponentialHistogramDataPoints(ctx context.Cont
|
|||||||
lbls := createAttributes(
|
lbls := createAttributes(
|
||||||
resource,
|
resource,
|
||||||
pt.Attributes(),
|
pt.Attributes(),
|
||||||
scope,
|
|
||||||
settings,
|
settings,
|
||||||
nil,
|
nil,
|
||||||
true,
|
true,
|
||||||
@ -254,7 +252,6 @@ func convertBucketsLayout(bucketCounts []uint64, offset, scaleDown int32, adjust
|
|||||||
|
|
||||||
func (c *PrometheusConverter) addCustomBucketsHistogramDataPoints(ctx context.Context, dataPoints pmetric.HistogramDataPointSlice,
|
func (c *PrometheusConverter) addCustomBucketsHistogramDataPoints(ctx context.Context, dataPoints pmetric.HistogramDataPointSlice,
|
||||||
resource pcommon.Resource, settings Settings, promName string, temporality pmetric.AggregationTemporality,
|
resource pcommon.Resource, settings Settings, promName string, temporality pmetric.AggregationTemporality,
|
||||||
scope scope,
|
|
||||||
) (annotations.Annotations, error) {
|
) (annotations.Annotations, error) {
|
||||||
var annots annotations.Annotations
|
var annots annotations.Annotations
|
||||||
|
|
||||||
@ -274,7 +271,6 @@ func (c *PrometheusConverter) addCustomBucketsHistogramDataPoints(ctx context.Co
|
|||||||
lbls := createAttributes(
|
lbls := createAttributes(
|
||||||
resource,
|
resource,
|
||||||
pt.Attributes(),
|
pt.Attributes(),
|
||||||
scope,
|
|
||||||
settings,
|
settings,
|
||||||
nil,
|
nil,
|
||||||
true,
|
true,
|
||||||
|
@ -620,27 +620,13 @@ func validateNativeHistogramCount(t *testing.T, h prompb.Histogram) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func TestPrometheusConverter_addExponentialHistogramDataPoints(t *testing.T) {
|
func TestPrometheusConverter_addExponentialHistogramDataPoints(t *testing.T) {
|
||||||
scopeAttrs := pcommon.NewMap()
|
|
||||||
scopeAttrs.FromRaw(map[string]any{
|
|
||||||
"attr1": "value1",
|
|
||||||
"attr2": "value2",
|
|
||||||
})
|
|
||||||
defaultScope := scope{
|
|
||||||
name: "test-scope",
|
|
||||||
version: "1.0.0",
|
|
||||||
schemaURL: "https://schema.com",
|
|
||||||
attributes: scopeAttrs,
|
|
||||||
}
|
|
||||||
|
|
||||||
tests := []struct {
|
tests := []struct {
|
||||||
name string
|
name string
|
||||||
metric func() pmetric.Metric
|
metric func() pmetric.Metric
|
||||||
scope scope
|
wantSeries func() map[uint64]*prompb.TimeSeries
|
||||||
promoteScope bool
|
|
||||||
wantSeries func() map[uint64]*prompb.TimeSeries
|
|
||||||
}{
|
}{
|
||||||
{
|
{
|
||||||
name: "histogram data points with same labels and without scope promotion",
|
name: "histogram data points with same labels",
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_hist")
|
metric.SetName("test_hist")
|
||||||
@ -664,8 +650,6 @@ func TestPrometheusConverter_addExponentialHistogramDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: false,
|
|
||||||
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
||||||
labels := []prompb.Label{
|
labels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist"},
|
{Name: model.MetricNameLabel, Value: "test_hist"},
|
||||||
@ -701,73 +685,7 @@ func TestPrometheusConverter_addExponentialHistogramDataPoints(t *testing.T) {
|
|||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "histogram data points with same labels and with scope promotion",
|
name: "histogram data points with different labels",
|
||||||
metric: func() pmetric.Metric {
|
|
||||||
metric := pmetric.NewMetric()
|
|
||||||
metric.SetName("test_hist")
|
|
||||||
metric.SetEmptyExponentialHistogram().SetAggregationTemporality(pmetric.AggregationTemporalityCumulative)
|
|
||||||
|
|
||||||
pt := metric.ExponentialHistogram().DataPoints().AppendEmpty()
|
|
||||||
pt.SetCount(7)
|
|
||||||
pt.SetScale(1)
|
|
||||||
pt.Positive().SetOffset(-1)
|
|
||||||
pt.Positive().BucketCounts().FromRaw([]uint64{4, 2})
|
|
||||||
pt.Exemplars().AppendEmpty().SetDoubleValue(1)
|
|
||||||
pt.Attributes().PutStr("attr", "test_attr")
|
|
||||||
|
|
||||||
pt = metric.ExponentialHistogram().DataPoints().AppendEmpty()
|
|
||||||
pt.SetCount(4)
|
|
||||||
pt.SetScale(1)
|
|
||||||
pt.Positive().SetOffset(-1)
|
|
||||||
pt.Positive().BucketCounts().FromRaw([]uint64{4, 2, 1})
|
|
||||||
pt.Exemplars().AppendEmpty().SetDoubleValue(2)
|
|
||||||
pt.Attributes().PutStr("attr", "test_attr")
|
|
||||||
|
|
||||||
return metric
|
|
||||||
},
|
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: true,
|
|
||||||
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
|
||||||
labels := []prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist"},
|
|
||||||
{Name: "attr", Value: "test_attr"},
|
|
||||||
{Name: "otel_scope_name", Value: defaultScope.name},
|
|
||||||
{Name: "otel_scope_schema_url", Value: defaultScope.schemaURL},
|
|
||||||
{Name: "otel_scope_version", Value: defaultScope.version},
|
|
||||||
{Name: "otel_scope_attr1", Value: "value1"},
|
|
||||||
{Name: "otel_scope_attr2", Value: "value2"},
|
|
||||||
}
|
|
||||||
return map[uint64]*prompb.TimeSeries{
|
|
||||||
timeSeriesSignature(labels): {
|
|
||||||
Labels: labels,
|
|
||||||
Histograms: []prompb.Histogram{
|
|
||||||
{
|
|
||||||
Count: &prompb.Histogram_CountInt{CountInt: 7},
|
|
||||||
Schema: 1,
|
|
||||||
ZeroThreshold: defaultZeroThreshold,
|
|
||||||
ZeroCount: &prompb.Histogram_ZeroCountInt{ZeroCountInt: 0},
|
|
||||||
PositiveSpans: []prompb.BucketSpan{{Offset: 0, Length: 2}},
|
|
||||||
PositiveDeltas: []int64{4, -2},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Count: &prompb.Histogram_CountInt{CountInt: 4},
|
|
||||||
Schema: 1,
|
|
||||||
ZeroThreshold: defaultZeroThreshold,
|
|
||||||
ZeroCount: &prompb.Histogram_ZeroCountInt{ZeroCountInt: 0},
|
|
||||||
PositiveSpans: []prompb.BucketSpan{{Offset: 0, Length: 3}},
|
|
||||||
PositiveDeltas: []int64{4, -2, -1},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
Exemplars: []prompb.Exemplar{
|
|
||||||
{Value: 1},
|
|
||||||
{Value: 2},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
}
|
|
||||||
},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "histogram data points with different labels and without scope promotion",
|
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_hist")
|
metric.SetName("test_hist")
|
||||||
@ -791,8 +709,6 @@ func TestPrometheusConverter_addExponentialHistogramDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: false,
|
|
||||||
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
||||||
labels := []prompb.Label{
|
labels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist"},
|
{Name: model.MetricNameLabel, Value: "test_hist"},
|
||||||
@ -853,12 +769,10 @@ func TestPrometheusConverter_addExponentialHistogramDataPoints(t *testing.T) {
|
|||||||
metric.ExponentialHistogram().DataPoints(),
|
metric.ExponentialHistogram().DataPoints(),
|
||||||
pcommon.NewResource(),
|
pcommon.NewResource(),
|
||||||
Settings{
|
Settings{
|
||||||
ExportCreatedMetric: true,
|
ExportCreatedMetric: true,
|
||||||
PromoteScopeMetadata: tt.promoteScope,
|
|
||||||
},
|
},
|
||||||
namer.Build(TranslatorMetricFromOtelMetric(metric)),
|
namer.Build(TranslatorMetricFromOtelMetric(metric)),
|
||||||
pmetric.AggregationTemporalityCumulative,
|
pmetric.AggregationTemporalityCumulative,
|
||||||
tt.scope,
|
|
||||||
)
|
)
|
||||||
require.NoError(t, err)
|
require.NoError(t, err)
|
||||||
require.Empty(t, annots)
|
require.Empty(t, annots)
|
||||||
@ -1077,27 +991,13 @@ func TestHistogramToCustomBucketsHistogram(t *testing.T) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func TestPrometheusConverter_addCustomBucketsHistogramDataPoints(t *testing.T) {
|
func TestPrometheusConverter_addCustomBucketsHistogramDataPoints(t *testing.T) {
|
||||||
scopeAttrs := pcommon.NewMap()
|
|
||||||
scopeAttrs.FromRaw(map[string]any{
|
|
||||||
"attr1": "value1",
|
|
||||||
"attr2": "value2",
|
|
||||||
})
|
|
||||||
defaultScope := scope{
|
|
||||||
name: "test-scope",
|
|
||||||
version: "1.0.0",
|
|
||||||
schemaURL: "https://schema.com",
|
|
||||||
attributes: scopeAttrs,
|
|
||||||
}
|
|
||||||
|
|
||||||
tests := []struct {
|
tests := []struct {
|
||||||
name string
|
name string
|
||||||
metric func() pmetric.Metric
|
metric func() pmetric.Metric
|
||||||
scope scope
|
wantSeries func() map[uint64]*prompb.TimeSeries
|
||||||
promoteScope bool
|
|
||||||
wantSeries func() map[uint64]*prompb.TimeSeries
|
|
||||||
}{
|
}{
|
||||||
{
|
{
|
||||||
name: "histogram data points with same labels and without scope promotion",
|
name: "histogram data points with same labels",
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_hist_to_nhcb")
|
metric.SetName("test_hist_to_nhcb")
|
||||||
@ -1121,8 +1021,6 @@ func TestPrometheusConverter_addCustomBucketsHistogramDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: false,
|
|
||||||
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
||||||
labels := []prompb.Label{
|
labels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist_to_nhcb"},
|
{Name: model.MetricNameLabel, Value: "test_hist_to_nhcb"},
|
||||||
@ -1158,73 +1056,7 @@ func TestPrometheusConverter_addCustomBucketsHistogramDataPoints(t *testing.T) {
|
|||||||
},
|
},
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
name: "histogram data points with same labels and with scope promotion",
|
name: "histogram data points with different labels",
|
||||||
metric: func() pmetric.Metric {
|
|
||||||
metric := pmetric.NewMetric()
|
|
||||||
metric.SetName("test_hist_to_nhcb")
|
|
||||||
metric.SetEmptyHistogram().SetAggregationTemporality(pmetric.AggregationTemporalityCumulative)
|
|
||||||
|
|
||||||
pt := metric.Histogram().DataPoints().AppendEmpty()
|
|
||||||
pt.SetCount(3)
|
|
||||||
pt.SetSum(3)
|
|
||||||
pt.BucketCounts().FromRaw([]uint64{2, 0, 1})
|
|
||||||
pt.ExplicitBounds().FromRaw([]float64{5, 10})
|
|
||||||
pt.Exemplars().AppendEmpty().SetDoubleValue(1)
|
|
||||||
pt.Attributes().PutStr("attr", "test_attr")
|
|
||||||
|
|
||||||
pt = metric.Histogram().DataPoints().AppendEmpty()
|
|
||||||
pt.SetCount(11)
|
|
||||||
pt.SetSum(5)
|
|
||||||
pt.BucketCounts().FromRaw([]uint64{3, 8, 0})
|
|
||||||
pt.ExplicitBounds().FromRaw([]float64{0, 1})
|
|
||||||
pt.Exemplars().AppendEmpty().SetDoubleValue(2)
|
|
||||||
pt.Attributes().PutStr("attr", "test_attr")
|
|
||||||
|
|
||||||
return metric
|
|
||||||
},
|
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: true,
|
|
||||||
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
|
||||||
labels := []prompb.Label{
|
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist_to_nhcb"},
|
|
||||||
{Name: "attr", Value: "test_attr"},
|
|
||||||
{Name: "otel_scope_name", Value: defaultScope.name},
|
|
||||||
{Name: "otel_scope_schema_url", Value: defaultScope.schemaURL},
|
|
||||||
{Name: "otel_scope_version", Value: defaultScope.version},
|
|
||||||
{Name: "otel_scope_attr1", Value: "value1"},
|
|
||||||
{Name: "otel_scope_attr2", Value: "value2"},
|
|
||||||
}
|
|
||||||
return map[uint64]*prompb.TimeSeries{
|
|
||||||
timeSeriesSignature(labels): {
|
|
||||||
Labels: labels,
|
|
||||||
Histograms: []prompb.Histogram{
|
|
||||||
{
|
|
||||||
Count: &prompb.Histogram_CountInt{CountInt: 3},
|
|
||||||
Sum: 3,
|
|
||||||
Schema: -53,
|
|
||||||
PositiveSpans: []prompb.BucketSpan{{Offset: 0, Length: 3}},
|
|
||||||
PositiveDeltas: []int64{2, -2, 1},
|
|
||||||
CustomValues: []float64{5, 10},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
Count: &prompb.Histogram_CountInt{CountInt: 11},
|
|
||||||
Sum: 5,
|
|
||||||
Schema: -53,
|
|
||||||
PositiveSpans: []prompb.BucketSpan{{Offset: 0, Length: 3}},
|
|
||||||
PositiveDeltas: []int64{3, 5, -8},
|
|
||||||
CustomValues: []float64{0, 1},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
Exemplars: []prompb.Exemplar{
|
|
||||||
{Value: 1},
|
|
||||||
{Value: 2},
|
|
||||||
},
|
|
||||||
},
|
|
||||||
}
|
|
||||||
},
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "histogram data points with different labels and without scope promotion",
|
|
||||||
metric: func() pmetric.Metric {
|
metric: func() pmetric.Metric {
|
||||||
metric := pmetric.NewMetric()
|
metric := pmetric.NewMetric()
|
||||||
metric.SetName("test_hist_to_nhcb")
|
metric.SetName("test_hist_to_nhcb")
|
||||||
@ -1248,8 +1080,6 @@ func TestPrometheusConverter_addCustomBucketsHistogramDataPoints(t *testing.T) {
|
|||||||
|
|
||||||
return metric
|
return metric
|
||||||
},
|
},
|
||||||
scope: defaultScope,
|
|
||||||
promoteScope: false,
|
|
||||||
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
wantSeries: func() map[uint64]*prompb.TimeSeries {
|
||||||
labels := []prompb.Label{
|
labels := []prompb.Label{
|
||||||
{Name: model.MetricNameLabel, Value: "test_hist_to_nhcb"},
|
{Name: model.MetricNameLabel, Value: "test_hist_to_nhcb"},
|
||||||
@ -1312,11 +1142,9 @@ func TestPrometheusConverter_addCustomBucketsHistogramDataPoints(t *testing.T) {
|
|||||||
Settings{
|
Settings{
|
||||||
ExportCreatedMetric: true,
|
ExportCreatedMetric: true,
|
||||||
ConvertHistogramsToNHCB: true,
|
ConvertHistogramsToNHCB: true,
|
||||||
PromoteScopeMetadata: tt.promoteScope,
|
|
||||||
},
|
},
|
||||||
namer.Build(TranslatorMetricFromOtelMetric(metric)),
|
namer.Build(TranslatorMetricFromOtelMetric(metric)),
|
||||||
pmetric.AggregationTemporalityCumulative,
|
pmetric.AggregationTemporalityCumulative,
|
||||||
tt.scope,
|
|
||||||
)
|
)
|
||||||
|
|
||||||
require.NoError(t, err)
|
require.NoError(t, err)
|
||||||
|
@ -50,8 +50,6 @@ type Settings struct {
|
|||||||
KeepIdentifyingResourceAttributes bool
|
KeepIdentifyingResourceAttributes bool
|
||||||
ConvertHistogramsToNHCB bool
|
ConvertHistogramsToNHCB bool
|
||||||
AllowDeltaTemporality bool
|
AllowDeltaTemporality bool
|
||||||
// PromoteScopeMetadata controls whether to promote OTel scope metadata to metric labels.
|
|
||||||
PromoteScopeMetadata bool
|
|
||||||
// LookbackDelta is the PromQL engine lookback delta.
|
// LookbackDelta is the PromQL engine lookback delta.
|
||||||
LookbackDelta time.Duration
|
LookbackDelta time.Duration
|
||||||
}
|
}
|
||||||
@ -96,23 +94,6 @@ func TranslatorMetricFromOtelMetric(metric pmetric.Metric) otlptranslator.Metric
|
|||||||
return m
|
return m
|
||||||
}
|
}
|
||||||
|
|
||||||
type scope struct {
|
|
||||||
name string
|
|
||||||
version string
|
|
||||||
schemaURL string
|
|
||||||
attributes pcommon.Map
|
|
||||||
}
|
|
||||||
|
|
||||||
func newScopeFromScopeMetrics(scopeMetrics pmetric.ScopeMetrics) scope {
|
|
||||||
s := scopeMetrics.Scope()
|
|
||||||
return scope{
|
|
||||||
name: s.Name(),
|
|
||||||
version: s.Version(),
|
|
||||||
schemaURL: scopeMetrics.SchemaUrl(),
|
|
||||||
attributes: s.Attributes(),
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// FromMetrics converts pmetric.Metrics to Prometheus remote write format.
|
// FromMetrics converts pmetric.Metrics to Prometheus remote write format.
|
||||||
func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metrics, settings Settings) (annots annotations.Annotations, errs error) {
|
func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metrics, settings Settings) (annots annotations.Annotations, errs error) {
|
||||||
namer := otlptranslator.MetricNamer{
|
namer := otlptranslator.MetricNamer{
|
||||||
@@ -141,9 +122,7 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
         earliestTimestamp := pcommon.Timestamp(math.MaxUint64)
         latestTimestamp := pcommon.Timestamp(0)
         for j := 0; j < scopeMetricsSlice.Len(); j++ {
-            scopeMetrics := scopeMetricsSlice.At(j)
-            scope := newScopeFromScopeMetrics(scopeMetrics)
-            metricSlice := scopeMetrics.Metrics()
+            metricSlice := scopeMetricsSlice.At(j).Metrics()

             // TODO: decide if instrumentation library information should be exported as labels
             for k := 0; k < metricSlice.Len(); k++ {
@@ -187,7 +166,7 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
                         errs = multierr.Append(errs, fmt.Errorf("empty data points. %s is dropped", metric.Name()))
                         break
                     }
-                    if err := c.addGaugeNumberDataPoints(ctx, dataPoints, resource, settings, promName, scope); err != nil {
+                    if err := c.addGaugeNumberDataPoints(ctx, dataPoints, resource, settings, promName); err != nil {
                         errs = multierr.Append(errs, err)
                         if errors.Is(err, context.Canceled) || errors.Is(err, context.DeadlineExceeded) {
                             return
@@ -199,7 +178,7 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
                         errs = multierr.Append(errs, fmt.Errorf("empty data points. %s is dropped", metric.Name()))
                         break
                     }
-                    if err := c.addSumNumberDataPoints(ctx, dataPoints, resource, metric, settings, promName, scope); err != nil {
+                    if err := c.addSumNumberDataPoints(ctx, dataPoints, resource, metric, settings, promName); err != nil {
                         errs = multierr.Append(errs, err)
                         if errors.Is(err, context.Canceled) || errors.Is(err, context.DeadlineExceeded) {
                             return
@@ -213,7 +192,7 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
                     }
                     if settings.ConvertHistogramsToNHCB {
                         ws, err := c.addCustomBucketsHistogramDataPoints(
-                            ctx, dataPoints, resource, settings, promName, temporality, scope,
+                            ctx, dataPoints, resource, settings, promName, temporality,
                         )
                         annots.Merge(ws)
                         if err != nil {
@@ -223,7 +202,7 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
                             }
                         }
                     } else {
-                        if err := c.addHistogramDataPoints(ctx, dataPoints, resource, settings, promName, scope); err != nil {
+                        if err := c.addHistogramDataPoints(ctx, dataPoints, resource, settings, promName); err != nil {
                             errs = multierr.Append(errs, err)
                             if errors.Is(err, context.Canceled) || errors.Is(err, context.DeadlineExceeded) {
                                 return
@@ -243,7 +222,6 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
                             settings,
                             promName,
                             temporality,
-                            scope,
                         )
                         annots.Merge(ws)
                         if err != nil {
@@ -258,7 +236,7 @@ func (c *PrometheusConverter) FromMetrics(ctx context.Context, md pmetric.Metric
                         errs = multierr.Append(errs, fmt.Errorf("empty data points. %s is dropped", metric.Name()))
                         break
                     }
-                    if err := c.addSummaryDataPoints(ctx, dataPoints, resource, settings, promName, scope); err != nil {
+                    if err := c.addSummaryDataPoints(ctx, dataPoints, resource, settings, promName); err != nil {
                         errs = multierr.Append(errs, err)
                         if errors.Is(err, context.Canceled) || errors.Is(err, context.DeadlineExceeded) {
                             return
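For readers following the FromMetrics hunks above, here is a minimal sketch of how a caller might drive the converter. It is not code from this commit: it assumes it compiles inside the same translator package, that NewPrometheusConverter() is that package's converter constructor, and it only sets Settings fields that appear as unchanged context in the diff.

package prometheusremotewrite

import (
    "context"
    "time"

    "go.opentelemetry.io/collector/pdata/pmetric"
)

// convertExample is a hypothetical helper, not part of the repository.
func convertExample(md pmetric.Metrics) error {
    c := NewPrometheusConverter() // assumed constructor for PrometheusConverter
    // Only Settings fields visible as unchanged context above are set here.
    annots, err := c.FromMetrics(context.Background(), md, Settings{
        KeepIdentifyingResourceAttributes: true,
        ConvertHistogramsToNHCB:           true,
        LookbackDelta:                     5 * time.Minute,
    })
    _ = annots // translator annotations (warnings); callers typically log these
    return err
}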
@@ -29,7 +29,7 @@ import (
 )

 func (c *PrometheusConverter) addGaugeNumberDataPoints(ctx context.Context, dataPoints pmetric.NumberDataPointSlice,
-    resource pcommon.Resource, settings Settings, name string, scope scope,
+    resource pcommon.Resource, settings Settings, name string,
 ) error {
     for x := 0; x < dataPoints.Len(); x++ {
         if err := c.everyN.checkContext(ctx); err != nil {
@@ -40,7 +40,6 @@ func (c *PrometheusConverter) addGaugeNumberDataPoints(ctx context.Context, data
         labels := createAttributes(
             resource,
             pt.Attributes(),
-            scope,
             settings,
             nil,
             true,
@@ -67,7 +66,7 @@ func (c *PrometheusConverter) addGaugeNumberDataPoints(ctx context.Context, data
 }

 func (c *PrometheusConverter) addSumNumberDataPoints(ctx context.Context, dataPoints pmetric.NumberDataPointSlice,
-    resource pcommon.Resource, metric pmetric.Metric, settings Settings, name string, scope scope,
+    resource pcommon.Resource, metric pmetric.Metric, settings Settings, name string,
 ) error {
     for x := 0; x < dataPoints.Len(); x++ {
         if err := c.everyN.checkContext(ctx); err != nil {
@@ -78,7 +77,6 @@ func (c *PrometheusConverter) addSumNumberDataPoints(ctx context.Context, dataPo
         lbls := createAttributes(
             resource,
             pt.Attributes(),
-            scope,
             settings,
             nil,
             true,
@@ -30,27 +30,14 @@ import (
 )

 func TestPrometheusConverter_addGaugeNumberDataPoints(t *testing.T) {
-    scopeAttrs := pcommon.NewMap()
-    scopeAttrs.FromRaw(map[string]any{
-        "attr1": "value1",
-        "attr2": "value2",
-    })
-    defaultScope := scope{
-        name:       "test-scope",
-        version:    "1.0.0",
-        schemaURL:  "https://schema.com",
-        attributes: scopeAttrs,
-    }
     ts := uint64(time.Now().UnixNano())
     tests := []struct {
         name   string
         metric func() pmetric.Metric
-        scope        scope
-        promoteScope bool
-        want         func() map[uint64]*prompb.TimeSeries
+        want   func() map[uint64]*prompb.TimeSeries
     }{
         {
-            name: "gauge without scope promotion",
+            name: "gauge",
             metric: func() pmetric.Metric {
                 return getIntGaugeMetric(
                     "test",
@@ -58,8 +45,6 @@ func TestPrometheusConverter_addGaugeNumberDataPoints(t *testing.T) {
                     1, ts,
                 )
             },
-            scope:        defaultScope,
-            promoteScope: false,
             want: func() map[uint64]*prompb.TimeSeries {
                 labels := []prompb.Label{
                     {Name: model.MetricNameLabel, Value: "test"},
@@ -77,39 +62,6 @@ func TestPrometheusConverter_addGaugeNumberDataPoints(t *testing.T) {
                 }
             },
         },
-        {
-            name: "gauge with scope promotion",
-            metric: func() pmetric.Metric {
-                return getIntGaugeMetric(
-                    "test",
-                    pcommon.NewMap(),
-                    1, ts,
-                )
-            },
-            scope:        defaultScope,
-            promoteScope: true,
-            want: func() map[uint64]*prompb.TimeSeries {
-                labels := []prompb.Label{
-                    {Name: model.MetricNameLabel, Value: "test"},
-                    {Name: "otel_scope_name", Value: defaultScope.name},
-                    {Name: "otel_scope_schema_url", Value: defaultScope.schemaURL},
-                    {Name: "otel_scope_version", Value: defaultScope.version},
-                    {Name: "otel_scope_attr1", Value: "value1"},
-                    {Name: "otel_scope_attr2", Value: "value2"},
-                }
-                return map[uint64]*prompb.TimeSeries{
-                    timeSeriesSignature(labels): {
-                        Labels: labels,
-                        Samples: []prompb.Sample{
-                            {
-                                Value:     1,
-                                Timestamp: convertTimeStamp(pcommon.Timestamp(ts)),
-                            },
-                        },
-                    },
-                }
-            },
-        },
     }
     for _, tt := range tests {
         t.Run(tt.name, func(t *testing.T) {
@@ -121,11 +73,9 @@ func TestPrometheusConverter_addGaugeNumberDataPoints(t *testing.T) {
                 metric.Gauge().DataPoints(),
                 pcommon.NewResource(),
                 Settings{
                     ExportCreatedMetric:  true,
-                    PromoteScopeMetadata: tt.promoteScope,
                 },
                 metric.Name(),
-                tt.scope,
             )

             require.Equal(t, tt.want(), converter.unique)
@@ -135,27 +85,14 @@ func TestPrometheusConverter_addGaugeNumberDataPoints(t *testing.T) {
 }

 func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
-    scopeAttrs := pcommon.NewMap()
-    scopeAttrs.FromRaw(map[string]any{
-        "attr1": "value1",
-        "attr2": "value2",
-    })
-    defaultScope := scope{
-        name:       "test-scope",
-        version:    "1.0.0",
-        schemaURL:  "https://schema.com",
-        attributes: scopeAttrs,
-    }
     ts := pcommon.Timestamp(time.Now().UnixNano())
     tests := []struct {
         name   string
         metric func() pmetric.Metric
-        scope        scope
-        promoteScope bool
-        want         func() map[uint64]*prompb.TimeSeries
+        want   func() map[uint64]*prompb.TimeSeries
     }{
         {
-            name: "sum without scope promotion",
+            name: "sum",
             metric: func() pmetric.Metric {
                 return getIntSumMetric(
                     "test",
@@ -164,8 +101,6 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
                     uint64(ts.AsTime().UnixNano()),
                 )
             },
-            scope:        defaultScope,
-            promoteScope: false,
             want: func() map[uint64]*prompb.TimeSeries {
                 labels := []prompb.Label{
                     {Name: model.MetricNameLabel, Value: "test"},
@@ -184,41 +119,7 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
             },
         },
         {
-            name: "sum with scope promotion",
-            metric: func() pmetric.Metric {
-                return getIntSumMetric(
-                    "test",
-                    pcommon.NewMap(),
-                    1,
-                    uint64(ts.AsTime().UnixNano()),
-                )
-            },
-            scope:        defaultScope,
-            promoteScope: true,
-            want: func() map[uint64]*prompb.TimeSeries {
-                labels := []prompb.Label{
-                    {Name: model.MetricNameLabel, Value: "test"},
-                    {Name: "otel_scope_name", Value: defaultScope.name},
-                    {Name: "otel_scope_schema_url", Value: defaultScope.schemaURL},
-                    {Name: "otel_scope_version", Value: defaultScope.version},
-                    {Name: "otel_scope_attr1", Value: "value1"},
-                    {Name: "otel_scope_attr2", Value: "value2"},
-                }
-                return map[uint64]*prompb.TimeSeries{
-                    timeSeriesSignature(labels): {
-                        Labels: labels,
-                        Samples: []prompb.Sample{
-                            {
-                                Value:     1,
-                                Timestamp: convertTimeStamp(ts),
-                            },
-                        },
-                    },
-                }
-            },
-        },
-        {
-            name: "sum with exemplars and without scope promotion",
+            name: "sum with exemplars",
             metric: func() pmetric.Metric {
                 m := getIntSumMetric(
                     "test",
@@ -229,8 +130,6 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
                 m.Sum().DataPoints().At(0).Exemplars().AppendEmpty().SetDoubleValue(2)
                 return m
             },
-            scope:        defaultScope,
-            promoteScope: false,
             want: func() map[uint64]*prompb.TimeSeries {
                 labels := []prompb.Label{
                     {Name: model.MetricNameLabel, Value: "test"},
@@ -250,7 +149,7 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
             },
         },
         {
-            name: "monotonic cumulative sum with start timestamp and without scope promotion",
+            name: "monotonic cumulative sum with start timestamp",
             metric: func() pmetric.Metric {
                 metric := pmetric.NewMetric()
                 metric.SetName("test_sum")
@@ -264,8 +163,6 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {

                 return metric
             },
-            scope:        defaultScope,
-            promoteScope: false,
             want: func() map[uint64]*prompb.TimeSeries {
                 labels := []prompb.Label{
                     {Name: model.MetricNameLabel, Value: "test_sum"},
@@ -290,7 +187,7 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
             },
         },
         {
-            name: "monotonic cumulative sum with no start time and without scope promotion",
+            name: "monotonic cumulative sum with no start time",
             metric: func() pmetric.Metric {
                 metric := pmetric.NewMetric()
                 metric.SetName("test_sum")
@@ -302,8 +199,6 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {

                 return metric
             },
-            scope:        defaultScope,
-            promoteScope: false,
             want: func() map[uint64]*prompb.TimeSeries {
                 labels := []prompb.Label{
                     {Name: model.MetricNameLabel, Value: "test_sum"},
@@ -319,7 +214,7 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
             },
         },
         {
-            name: "non-monotonic cumulative sum with start time and without scope promotion",
+            name: "non-monotonic cumulative sum with start time",
             metric: func() pmetric.Metric {
                 metric := pmetric.NewMetric()
                 metric.SetName("test_sum")
@@ -331,8 +226,6 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {

                 return metric
             },
-            scope:        defaultScope,
-            promoteScope: false,
             want: func() map[uint64]*prompb.TimeSeries {
                 labels := []prompb.Label{
                     {Name: model.MetricNameLabel, Value: "test_sum"},
@@ -359,11 +252,9 @@ func TestPrometheusConverter_addSumNumberDataPoints(t *testing.T) {
                 pcommon.NewResource(),
                 metric,
                 Settings{
                     ExportCreatedMetric:  true,
-                    PromoteScopeMetadata: tt.promoteScope,
                 },
                 metric.Name(),
-                tt.scope,
             )

             require.Equal(t, tt.want(), converter.unique)
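The test expectations above show scope metadata surfacing as otel_scope_* labels when promotion is enabled. As a rough illustration of that mapping only (not the repository's implementation, and with attribute-key sanitization omitted), the promotion amounts to something like:

package main

import (
    "fmt"

    "go.opentelemetry.io/collector/pdata/pcommon"

    "github.com/prometheus/prometheus/prompb"
)

// scopeLabels is a hypothetical helper that mirrors the label names used in
// the test expectations above (otel_scope_name, otel_scope_version,
// otel_scope_schema_url, otel_scope_<attribute key>).
func scopeLabels(name, version, schemaURL string, attrs pcommon.Map) []prompb.Label {
    labels := []prompb.Label{
        {Name: "otel_scope_name", Value: name},
        {Name: "otel_scope_schema_url", Value: schemaURL},
        {Name: "otel_scope_version", Value: version},
    }
    attrs.Range(func(k string, v pcommon.Value) bool {
        labels = append(labels, prompb.Label{Name: "otel_scope_" + k, Value: v.AsString()})
        return true
    })
    return labels
}

func main() {
    attrs := pcommon.NewMap()
    attrs.PutStr("attr1", "value1")
    attrs.PutStr("attr2", "value2")
    for _, l := range scopeLabels("test-scope", "1.0.0", "https://schema.com", attrs) {
        fmt.Printf("%s=%q\n", l.Name, l.Value)
    }
}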
@@ -602,7 +602,6 @@ func (rw *rwExporter) ConsumeMetrics(ctx context.Context, md pmetric.Metrics) er
         KeepIdentifyingResourceAttributes: otlpCfg.KeepIdentifyingResourceAttributes,
         ConvertHistogramsToNHCB:           otlpCfg.ConvertHistogramsToNHCB,
         AllowDeltaTemporality:             rw.allowDeltaTemporality,
-        PromoteScopeMetadata:              otlpCfg.PromoteScopeMetadata,
         LookbackDelta:                     rw.lookbackDelta,
     })
     if err != nil {
@@ -1,7 +1,7 @@
 {
   "name": "@prometheus-io/mantine-ui",
   "private": true,
-  "version": "0.304.2",
+  "version": "0.305.0",
   "type": "module",
   "scripts": {
     "start": "vite",
@@ -28,7 +28,7 @@
     "@microsoft/fetch-event-source": "^2.0.1",
     "@nexucis/fuzzy": "^0.5.1",
     "@nexucis/kvsearch": "^0.9.1",
-    "@prometheus-io/codemirror-promql": "0.304.2",
+    "@prometheus-io/codemirror-promql": "0.305.0",
     "@reduxjs/toolkit": "^2.7.0",
     "@tabler/icons-react": "^3.31.0",
     "@tanstack/react-query": "^5.74.7",
@@ -1,6 +1,6 @@
 {
   "name": "@prometheus-io/codemirror-promql",
-  "version": "0.304.2",
+  "version": "0.305.0",
   "description": "a CodeMirror mode for the PromQL language",
   "types": "dist/esm/index.d.ts",
   "module": "dist/esm/index.js",
@@ -29,7 +29,7 @@
   },
   "homepage": "https://github.com/prometheus/prometheus/blob/main/web/ui/module/codemirror-promql/README.md",
   "dependencies": {
-    "@prometheus-io/lezer-promql": "0.304.2",
+    "@prometheus-io/lezer-promql": "0.305.0",
     "lru-cache": "^11.1.0"
   },
   "devDependencies": {
@@ -1,6 +1,6 @@
 {
   "name": "@prometheus-io/lezer-promql",
-  "version": "0.304.2",
+  "version": "0.305.0",
   "description": "lezer-based PromQL grammar",
   "main": "dist/index.cjs",
   "type": "module",

web/ui/package-lock.json (generated): 14 changed lines
@@ -1,12 +1,12 @@
 {
   "name": "prometheus-io",
-  "version": "0.304.2",
+  "version": "0.305.0",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "prometheus-io",
-      "version": "0.304.2",
+      "version": "0.305.0",
       "workspaces": [
         "mantine-ui",
         "module/*"
@@ -24,7 +24,7 @@
     },
     "mantine-ui": {
       "name": "@prometheus-io/mantine-ui",
-      "version": "0.304.2",
+      "version": "0.305.0",
       "dependencies": {
         "@codemirror/autocomplete": "^6.18.6",
         "@codemirror/language": "^6.11.0",
@@ -42,7 +42,7 @@
         "@microsoft/fetch-event-source": "^2.0.1",
         "@nexucis/fuzzy": "^0.5.1",
         "@nexucis/kvsearch": "^0.9.1",
-        "@prometheus-io/codemirror-promql": "0.304.2",
+        "@prometheus-io/codemirror-promql": "0.305.0",
         "@reduxjs/toolkit": "^2.7.0",
         "@tabler/icons-react": "^3.31.0",
         "@tanstack/react-query": "^5.74.7",
@@ -189,10 +189,10 @@
     },
     "module/codemirror-promql": {
       "name": "@prometheus-io/codemirror-promql",
-      "version": "0.304.2",
+      "version": "0.305.0",
       "license": "Apache-2.0",
       "dependencies": {
-        "@prometheus-io/lezer-promql": "0.304.2",
+        "@prometheus-io/lezer-promql": "0.305.0",
         "lru-cache": "^11.1.0"
       },
       "devDependencies": {
@@ -222,7 +222,7 @@
     },
     "module/lezer-promql": {
       "name": "@prometheus-io/lezer-promql",
-      "version": "0.304.2",
+      "version": "0.305.0",
       "license": "Apache-2.0",
       "devDependencies": {
         "@lezer/generator": "^1.7.3",
@@ -1,7 +1,7 @@
 {
   "name": "prometheus-io",
   "description": "Monorepo for the Prometheus UI",
-  "version": "0.304.2",
+  "version": "0.305.0",
   "private": true,
   "scripts": {
     "build": "bash build_ui.sh --all",