author      Jon Leech <oddhack@sonic.net>    2024-05-05 21:55:17 -0700
committer   Jon Leech <oddhack@sonic.net>    2024-05-05 21:55:17 -0700
commit      ff188a8ffa950a010422c4c4b277f06db10d0dce (patch)
tree        11a5f2ec5ed4062f8eb6b64e963520e81248c7b6
parent      8a8a32f67d08e0df011a4406135b0ec7047d90e6 (diff)
download    gfxstream-protocols-upstream-main.tar.gz
Change log for May 5, 2024 Vulkan 1.3.284 spec update:
Public Issues
* Refactor "`proposals`" into a separate Antora component "`features`" and
refer to published proposals more consistently as "`feature
descriptions`" (public PR 2361).
Internal Issues
* Partial synchronization with OpenXR scripts (internal MR 6419).
* Refactor extensionmetadocgenerator.py to simplify adding new sections
(internal MR 6624).
  * Restore structextends="VkPhysicalDeviceProperties2" for
    VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT (internal MR 6631).
New Extensions
* VK_EXT_legacy_vertex_attributes
32 files changed, 609 insertions, 233 deletions
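One of the script cleanups in the diff below is the extensionmetadocgenerator.py refactor (internal MR 6624), which replaces three near-identical runs of `write()` calls with a single parameterized helper that emits an appendix-section header. The sketch below is a self-contained model of that helper — the parameter names and emitted asciidoc follow the diff, but the surrounding generator class and real output files are omitted:

```python
import io

def write_appendix_header(guard_prefix, prefix, file_suffix,
                          section_title, fp, guard=None):
    """Write the header of an extension appendix section.

    The current, deprecated, and provisional headers differ only in a
    few names, so they are generated from parameters instead of being
    written out three times.
    """
    if guard is not None:
        ifdef_protect = f'ifdef::{guard}[]'
        endif_protect = f'endif::{guard}[]'
    else:
        ifdef_protect = ''
        endif_protect = ''

    # The '*_toc' index is skipped for Antora site builds, where the
    # navigation sidebar already indexes every extension.
    print('',
          f'include::{{generated}}/meta/{guard_prefix}_extensions_guard_macro{file_suffix}[]',
          '',
          ifdef_protect,
          f'[[{prefix}-extension-appendices-list]]',
          section_title,
          'ifndef::site-gen-antora[]',
          f'include::{{generated}}/meta/{prefix}_extension_appendices_toc{file_suffix}[]',
          'endif::site-gen-antora[]',
          '\n<<<\n',
          f'include::{{generated}}/meta/{prefix}_extension_appendices{file_suffix}[]',
          endif_protect,
          file=fp, sep='\n')

# Example: the provisional-extensions header, guarded so it only appears
# when at least one provisional extension is included in the build.
demo = io.StringIO()
write_appendix_header(guard_prefix='provisional',
                      prefix='provisional',
                      file_suffix='.adoc',
                      section_title='== List of Provisional Extensions',
                      fp=demo,
                      guard='HAS_PROVISIONAL_EXTENSIONS')
```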
diff --git a/ChangeLog.adoc b/ChangeLog.adoc index e2f4c22a..a4591872 100644 --- a/ChangeLog.adoc +++ b/ChangeLog.adoc @@ -14,6 +14,28 @@ appears frequently in the change log. ''' +Change log for May 5, 2024 Vulkan 1.3.284 spec update: + +Public Issues + + * Refactor "`proposals`" into a separate Antora component "`features`" and + refer to published proposals more consistently as "`feature + descriptions`" (public PR 2361). + +Internal Issues + + * Partial synchronization with OpenXR scripts (internal MR 6419). + * Refactor extensionmetadocgenerator.py to simplify adding new sections + (internal MR 6624). + * Restore structextends="VkPhysicalDeviceProperties2" back for + VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT (internal MR 6631). + +New Extensions + + * VK_EXT_legacy_vertex_attributes + +''' + Change log for April 19, 2024 Vulkan 1.3.283 spec update: Public Issues @@ -138,7 +138,7 @@ VERBOSE = # ADOCOPTS options for asciidoc->HTML5 output NOTEOPTS = -a editing-notes -a implementation-guide -PATCHVERSION = 283 +PATCHVERSION = 284 BASEOPTS = ifneq (,$(findstring VKSC_VERSION_1_0,$(VERSIONS))) diff --git a/appendices/VK_EXT_legacy_vertex_attributes.adoc b/appendices/VK_EXT_legacy_vertex_attributes.adoc new file mode 100644 index 00000000..1c5c062f --- /dev/null +++ b/appendices/VK_EXT_legacy_vertex_attributes.adoc @@ -0,0 +1,47 @@ +// Copyright 2023-2024 The Khronos Group Inc. +// +// SPDX-License-Identifier: CC-BY-4.0 + +include::{generated}/meta/{refprefix}VK_EXT_legacy_vertex_attributes.adoc[] + +=== Other Extension Metadata + +*Last Modified Date*:: + 2024-02-23 +*IP Status*:: + No known IP claims. 
+*Contributors*:: + - Mike Blumenkrantz, Valve + - Piers Daniell, NVIDIA + - Spencer Fricke, LunarG + - Alyssa Rosenzweig, Valve + +=== Description + +This extension adds support for legacy features of (non-64-bit) vertex +attributes as found in OpenGL: + + - Vertex attributes loaded from arbitrary buffer alignments + - Vertex attributes using arbitrary strides + - Vertex attributes where the component data type of the binding does not + match the component numeric type of the shader input + +These features are only usable with dynamic vertex input. +Unaligned loads of vertex attributes may incur performance penalties, +indicated with a property. + +include::{generated}/interfaces/VK_EXT_legacy_vertex_attributes.adoc[] + +=== Issues + +1.) Should implementations convert float/integer values? + +*RESOLVED*: No. +When fetching an integer data type from float values or float data types +from integer values, the resulting shader values are +implementation-dependent. + +=== Version History + + * Revision 1, 2024-02-16 (Mike Blumenkrantz) + ** Initial revision diff --git a/chapters/commonvalidity/draw_vertex_binding.adoc b/chapters/commonvalidity/draw_vertex_binding.adoc index 889d05cf..05d0c834 100644 --- a/chapters/commonvalidity/draw_vertex_binding.adoc +++ b/chapters/commonvalidity/draw_vertex_binding.adoc @@ -102,6 +102,13 @@ ifdef::VK_EXT_vertex_input_dynamic_state[] the bound graphics pipeline state was created with the ename:VK_DYNAMIC_STATE_VERTEX_INPUT_EXT dynamic state enabled endif::VK_EXT_vertex_input_dynamic_state[] +ifdef::VK_EXT_legacy_vertex_attributes[] + and either <<features-legacyVertexAttributes, + pname:legacyVertexAttributes>> is not enabled or the SPIR-V Type + associated with a given code:Input variable of the corresponding + code:Location in the code:Vertex {ExecutionModel} code:OpEntryPoint is + 64-bit, +endif::VK_EXT_legacy_vertex_attributes[] then the numeric type associated with all code:Input variables of the corresponding code:Location in 
the code:Vertex {ExecutionModel} code:OpEntryPoint must: be the same as diff --git a/chapters/features.adoc b/chapters/features.adoc index 0044480f..b105189e 100644 --- a/chapters/features.adoc +++ b/chapters/features.adoc @@ -3804,6 +3804,30 @@ include::{generated}/validity/structs/VkPhysicalDeviceAttachmentFeedbackLoopDyna -- endif::VK_EXT_attachment_feedback_loop_dynamic_state[] +ifdef::VK_EXT_legacy_vertex_attributes[] +[open,refpage='VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT',desc='Structure describing compatibility features for vertex attributes',type='structs'] +-- +The sname:VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT structure is +defined as: + +include::{generated}/api/structs/VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT.adoc[] + +This structure describes the following features: + + * pname:sType is a elink:VkStructureType value identifying this structure. + * pname:pNext is `NULL` or a pointer to a structure extending this + structure. + * [[features-legacyVertexAttributes]] pname:legacyVertexAttributes + specifies whether compatibility features for vertex attributes are + supported when using dynamic vertex input state. + +:refpage: VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT +include::{chapters}/features.adoc[tag=features] + +include::{generated}/validity/structs/VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT.adoc[] +-- +endif::VK_EXT_legacy_vertex_attributes[] + ifdef::VK_VERSION_1_3,VK_EXT_texture_compression_astc_hdr[] [open,refpage='VkPhysicalDeviceTextureCompressionASTCHDRFeatures',desc='Structure describing ASTC HDR features that can be supported by an implementation',type='structs',alias='VkPhysicalDeviceTextureCompressionASTCHDRFeaturesEXT'] -- @@ -8449,6 +8473,10 @@ ifdef::VK_EXT_attachment_feedback_loop_dynamic_state[] `apiext:VK_EXT_attachment_feedback_loop_dynamic_state` extension is supported. 
endif::VK_EXT_attachment_feedback_loop_dynamic_state[] +ifdef::VK_EXT_legacy_vertex_attributes[] + * <<features-legacyVertexAttributes, pname:legacyVertexAttributes>>, if + the `apiext:VK_EXT_legacy_vertex_attributes` extension is supported. +endif::VK_EXT_legacy_vertex_attributes[] ifdef::VK_KHR_ray_tracing_position_fetch[] * <<features-rayTracingPositionFetch, pname:rayTracingPositionFetch>>, if the `apiext:VK_KHR_ray_tracing_position_fetch` extension is supported. diff --git a/chapters/fxvertex.adoc b/chapters/fxvertex.adoc index 0ee3b109..0f7e61ea 100644 --- a/chapters/fxvertex.adoc +++ b/chapters/fxvertex.adoc @@ -978,11 +978,22 @@ The numeric type of pname:format must: match the numeric type of the input variable in the shader. The input variable in the shader must: be declared as a 64-bit data type if and only if pname:format is a 64-bit data type. -If pname:format is a packed format, `attribAddress` must: be a multiple of -the size in bytes of the whole attribute data type as described in +If +ifdef::VK_EXT_legacy_vertex_attributes[] +either pname:format is a 64-bit format or <<features-legacyVertexAttributes, +pname:legacyVertexAttributes>> is not enabled, and +endif::VK_EXT_legacy_vertex_attributes[] +pname:format is a packed format, `attribAddress` must: be a multiple of the +size in bytes of the whole attribute data type as described in <<formats-packed,Packed Formats>>. -Otherwise, `attribAddress` must: be a multiple of the size in bytes of the -component type indicated by pname:format (see <<formats,Formats>>). +Otherwise, +ifdef::VK_EXT_legacy_vertex_attributes[] +if either pname:format is a 64-bit format or +<<features-legacyVertexAttributes, pname:legacyVertexAttributes>> is not +enabled, +endif::VK_EXT_legacy_vertex_attributes[] +`attribAddress` must: be a multiple of the size in bytes of the component +type indicated by pname:format (see <<formats,Formats>>). 
For attributes that are not 64-bit data types, each component is converted to the format of the input variable based on its type and size (as defined in the <<formats-definition,Format Definition>> section for each diff --git a/chapters/limits.adoc b/chapters/limits.adoc index 663ae607..ac8d69a3 100644 --- a/chapters/limits.adoc +++ b/chapters/limits.adoc @@ -1166,6 +1166,30 @@ include::{generated}/validity/structs/VkPhysicalDeviceSampleLocationsPropertiesE -- endif::VK_EXT_sample_locations[] +ifdef::VK_EXT_legacy_vertex_attributes[] +[open,refpage='VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT',desc='Structure describing properties for legacy vertex attributes',type='structs'] +-- +The sname:VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT structure is +defined as: + +include::{generated}/api/structs/VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT.adoc[] + +This structure describes the following features: + + * pname:sType is a elink:VkStructureType value identifying this structure. + * pname:pNext is `NULL` or a pointer to a structure extending this + structure. + * [[limits-nativeUnalignedPerformance]] pname:nativeUnalignedPerformance + specifies whether unaligned vertex fetches do not incur significant + performance penalties as compared to aligned fetches. 
+ +:refpage: VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT +include::{chapters}/features.adoc[tag=features] + +include::{generated}/validity/structs/VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT.adoc[] +-- +endif::VK_EXT_legacy_vertex_attributes[] + ifdef::VK_EXT_external_memory_host[] [open,refpage='VkPhysicalDeviceExternalMemoryHostPropertiesEXT',desc='Structure describing external memory host pointer limits that can be supported by an implementation',type='structs'] -- @@ -5163,6 +5187,9 @@ ifdef::VK_EXT_sample_locations[] | pname:sampleLocationSubPixelBits | - | 4 | min | pname:variableSampleLocations | - |false| implementation-dependent endif::VK_EXT_sample_locations[] +ifdef::VK_EXT_legacy_vertex_attributes[] +| pname:nativeUnalignedPerformance | - |false| implementation-dependent +endif::VK_EXT_legacy_vertex_attributes[] ifdef::VK_EXT_external_memory_host[] | pname:minImportedHostPointerAlignment | - | 65536 | max endif::VK_EXT_external_memory_host[] diff --git a/proposals/VK_EXT_legacy_vertex_attributes.adoc b/proposals/VK_EXT_legacy_vertex_attributes.adoc new file mode 100644 index 00000000..0941f86a --- /dev/null +++ b/proposals/VK_EXT_legacy_vertex_attributes.adoc @@ -0,0 +1,73 @@ +// Copyright 2024 The Khronos Group Inc. +// +// SPDX-License-Identifier: CC-BY-4.0 + +# VK_EXT_legacy_vertex_attributes +:toc: left +:refpage: https://registry.khronos.org/vulkan/specs/1.3-extensions/man/html/ +:sectnums: + +This document proposes adding legacy features for vertex attributes as found in OpenGL. + +## Problem Statement + +OpenGL allows three features that Vulkan explicitly prohibits: + + - Vertex attributes loaded from arbitrary buffer alignments + - Vertex attributes using arbitrary strides + - Vertex attributes where the component data type of the binding does not match the component numeric type of the shader input + +This proposal aims to provide this legacy functionality for non-64-bit attributes. 
+ + +## Solution Space + +These legacy features can be emulated by rewriting vertex buffers and generating shader variants. Neither option +is as optimal as having the underlying driver handle the functionality, where it may be a no-op. + +## Proposal + +### API Features + +The following features are exposed by this extension: + +[source,c] +---- +typedef struct VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT { + VkStructureType sType; + void* pNext; + VkBool32 legacyVertexAttributes; +} VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT; +---- + +`legacyVertexAttributes` is the core feature enabling this extension's functionality. + +The following properties are exposed by this extension: + +[source,c] +---- +typedef struct VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT { + VkStructureType sType; + void* pNext; + VkBool32 nativeUnalignedPerformance; +} VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT; +---- + +`nativeUnalignedPerformance` indicates that using unaligned vertex fetches on this implementation will not incur significant performance penalties. + +## Examples + +Enabling this feature allows the following example scenarios for a user with dynamic vertex input active: + + - Binding a vertex buffer at offset=7 + - Binding a VK_FORMAT_R32_UINT attribute with stride=1 + - Binding a VK_FORMAT_R8_UINT attribute and reading it as signed `int` in a shader + +## Issues + + +### RESOLVED: Should implementations convert float/integer values? + +No. When fetching an integer data type from float values or float +data types from integer values, the resulting shader values are +implementation-dependent. 
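The feature and property structures in the proposal above are discovered the usual Vulkan way: chained through `pNext` behind `VkPhysicalDeviceFeatures2`/`VkPhysicalDeviceProperties2` and filled in by `vkGetPhysicalDeviceFeatures2`/`vkGetPhysicalDeviceProperties2`. As a language-neutral illustration (not real Vulkan bindings — the classes below are simplified stand-ins, and the `sType` constant is derived from the struct name by the standard naming convention), here is how a `pNext` chain walk locates the `legacyVertexAttributes` bit:

```python
# Illustrative stand-ins for the Vulkan structures named above; a real
# application would use the C structs (or a Vulkan binding) instead.
class VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT:
    def __init__(self):
        self.sType = ('VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_'
                      'LEGACY_VERTEX_ATTRIBUTES_FEATURES_EXT')
        self.pNext = None
        self.legacyVertexAttributes = False  # filled in by the driver

class VkPhysicalDeviceFeatures2:
    def __init__(self, pNext=None):
        self.sType = 'VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2'
        self.pNext = pNext

def find_in_chain(head, stype):
    """Walk a pNext chain and return the first structure whose sType
    matches, or None if the chain does not contain it -- the same lookup
    a driver performs on the chain passed to vkGetPhysicalDeviceFeatures2."""
    node = head
    while node is not None:
        if node.sType == stype:
            return node
        node = node.pNext
    return None

# Query pattern: chain the extension struct behind VkPhysicalDeviceFeatures2,
# then inspect the returned boolean before relying on legacy vertex fetches.
legacy = VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT()
features2 = VkPhysicalDeviceFeatures2(pNext=legacy)
```

The same pattern applies to `VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT`: an application that finds `nativeUnalignedPerformance` false may prefer aligned vertex buffers even though unaligned fetches are permitted.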
diff --git a/scripts/cgenerator.py b/scripts/cgenerator.py index f86658ee..8192c182 100644 --- a/scripts/cgenerator.py +++ b/scripts/cgenerator.py @@ -243,7 +243,7 @@ class COutputGenerator(OutputGenerator): if self.genOpts.conventions is None: raise MissingGeneratorOptionsConventionsError() is_core = self.featureName and self.featureName.startswith(self.conventions.api_prefix + 'VERSION_') - if self.genOpts.conventions.writeFeature(self.featureExtraProtect, self.genOpts.filename): + if self.genOpts.conventions.writeFeature(self.featureName, self.featureExtraProtect, self.genOpts.filename): self.newline() if self.genOpts.protectFeature: write('#ifndef', self.featureName, file=self.outFile) @@ -507,7 +507,7 @@ class COutputGenerator(OutputGenerator): self.appendSection('commandPointer', decls[1]) def misracstyle(self): - return self.genOpts.misracstyle; + return self.genOpts.misracstyle def misracppstyle(self): - return self.genOpts.misracppstyle; + return self.genOpts.misracppstyle diff --git a/scripts/check_spec_links.py b/scripts/check_spec_links.py index e5ca207f..ea388aa0 100755 --- a/scripts/check_spec_links.py +++ b/scripts/check_spec_links.py @@ -19,6 +19,7 @@ from spec_tools.macro_checker_file import MacroCheckerFile from spec_tools.main import checkerMain from spec_tools.shared import (AUTO_FIX_STRING, EXTENSION_CATEGORY, MessageId, MessageType) +from apiconventions import APIConventions ### # "Configuration" constants @@ -50,6 +51,33 @@ SYSTEM_TYPES = set(('void', 'char', 'float', 'size_t', 'uintptr_t', 'int32_t', 'uint32_t', 'int64_t', 'uint64_t')) +# Exceptions to: +# error: Definition of link target {} with macro etext (used for category enums) does not exist. (-Wwrong_macro) +# typically caused by using Vulkan-only enums in Vulkan SC blocks with "etext", or because they +# are suffixed differently. 
+CHECK_UNRECOGNIZED_ETEXT_EXCEPTIONS = ( + "VK_COLORSPACE_SRGB_NONLINEAR_KHR", + "VK_COLOR_SPACE_DCI_P3_LINEAR_EXT", + "VK_PIPELINE_CACHE_CREATE_READ_ONLY_BIT", + "VK_STENCIL_FRONT_AND_BACK", + "VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SHADER_DRAW_PARAMETER_FEATURES", + "VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_VARIABLE_POINTER_FEATURES", + "VK_STRUCTURE_TYPE_SURFACE_CAPABILITIES2_EXT", +) + +# Exceptions to: +# warning: Definition of link target {} with macro ename (used for category enums) does not exist. (-Wbad_enumerant) +# typically caused by Vulkan SC enums not being recognized in Vulkan build +CHECK_UNRECOGNIZED_ENAME_EXCEPTIONS = ( + "VK_ERROR_INVALID_PIPELINE_CACHE_DATA", + "VK_ERROR_NO_PIPELINE_MATCH", + "VK_ERROR_VALIDATION_FAILED", + "VK_MEMORY_HEAP_SEU_SAFE_BIT", + "VK_PIPELINE_CACHE_CREATE_READ_ONLY_BIT", + "VK_PIPELINE_CACHE_CREATE_USE_APPLICATION_STORAGE_BIT", + "VK_PIPELINE_CACHE_HEADER_VERSION_SAFETY_CRITICAL_ONE", +) + ROOT = Path(__file__).resolve().parent.parent DEFAULT_DISABLED_MESSAGES = set(( MessageId.LEGACY, @@ -121,6 +149,12 @@ class VulkanMacroCheckerFile(MacroCheckerFile): return type != 'builtins' and type != 'spirv' + def shouldSkipUnrecognizedEntity(self, macro, entity_name): + """Return True if we should not warn about not recognizing a macro invocation for entity_name.""" + if macro == "ename": + return entity_name in CHECK_UNRECOGNIZED_ENAME_EXCEPTIONS + return entity_name in CHECK_UNRECOGNIZED_ETEXT_EXCEPTIONS + def handleWrongMacro(self, msg, data): """Report an appropriate message when we found that the macro used is incorrect. @@ -157,15 +191,13 @@ class VulkanMacroCheckerFile(MacroCheckerFile): def allowEnumXrefs(self): """Returns True if enums can be specified in the 'xrefs' attribute of a refpage. - - Overrides base class behavior. OpenXR does not allow this. 
""" return True def makeMacroChecker(enabled_messages): """Create a correctly-configured MacroChecker instance.""" entity_db = VulkanEntityDatabase() - return MacroChecker(enabled_messages, entity_db, VulkanMacroCheckerFile, ROOT) + return MacroChecker(enabled_messages, entity_db, VulkanMacroCheckerFile, ROOT, APIConventions) if __name__ == '__main__': diff --git a/scripts/docgenerator.py b/scripts/docgenerator.py index b714ef7c..c013d350 100644 --- a/scripts/docgenerator.py +++ b/scripts/docgenerator.py @@ -35,14 +35,12 @@ def orgLevelKey(name): 'VK_KHR_', 'VK_EXT_') - i = 0 - for prefix in prefixes: + for index, prefix in enumerate(prefixes): if name.startswith(prefix): - return i - i += 1 + return index # Everything else (e.g. vendor extensions) is least important - return i + return len(prefixes) class DocGeneratorOptions(GeneratorOptions): @@ -232,12 +230,12 @@ class DocOutputGenerator(OutputGenerator): - basename - base name of the file - contents - contents of the file (Asciidoc boilerplate aside)""" # Create subdirectory, if needed - directory = self.genOpts.directory + '/' + directory + directory = Path(self.genOpts.directory) / directory self.makeDir(directory) # Create file - filename = directory + '/' + basename + self.file_suffix - self.logMsg('diag', '# Generating include file:', filename) + filename = directory / (f"{basename}{self.file_suffix}") + self.logMsg('diag', '# Generating include file:', str(filename)) fp = open(filename, 'w', encoding='utf-8') # Asciidoc anchor @@ -267,7 +265,7 @@ class DocOutputGenerator(OutputGenerator): if self.genOpts.secondaryInclude: # Create secondary no cross-reference include file - filename = f'{directory}/{basename}.no-xref{self.file_suffix}' + filename = directory / f'{basename}.no-xref{self.file_suffix}' self.logMsg('diag', '# Generating include file:', filename) fp = open(filename, 'w', encoding='utf-8') @@ -283,7 +281,7 @@ class DocOutputGenerator(OutputGenerator): def writeEnumTable(self, basename, values): 
"""Output a table of enumerants.""" directory = Path(self.genOpts.directory) / 'enums' - self.makeDir(str(directory)) + self.makeDir(directory) filename = str(directory / f'{basename}.comments{self.file_suffix}') self.logMsg('diag', '# Generating include file:', filename) @@ -349,6 +347,9 @@ class DocOutputGenerator(OutputGenerator): name, category)) else: body = self.genRequirements(name) + # This is not appropriate for Vulkan + # if category in ('define',): + # body = body.strip() if alias: # If the type is an alias, just emit a typedef declaration body += 'typedef ' + alias + ' ' + name + ';\n' diff --git a/scripts/extensionmetadocgenerator.py b/scripts/extensionmetadocgenerator.py index a200bab9..883bafc9 100644 --- a/scripts/extensionmetadocgenerator.py +++ b/scripts/extensionmetadocgenerator.py @@ -7,6 +7,8 @@ import os import re import sys +from pathlib import Path + from functools import total_ordering from generator import GeneratorOptions, OutputGenerator, regSortFeatures, write from parse_dependency import dependencyMarkup, dependencyNames @@ -146,36 +148,36 @@ class Extension: if isRefpage: # Always link into API spec specURL = self.conventions.specURL('api') - return 'link:{}#{}[{}^]'.format(specURL, xrefName, xrefText) + return f'link:{specURL}#{xrefName}[{xrefText}^]' else: - return '<<' + xrefName + ', ' + xrefText + '>>' + return f'<<{xrefName}, {xrefText}>>' def conditionalLinkCoreAPI(self, apiVersion, linkSuffix, isRefpage): versionMatch = re.match(self.conventions.api_version_prefix + r'(\d+)_(\d+)', apiVersion) major = versionMatch.group(1) minor = versionMatch.group(2) - dottedVersion = major + '.' 
+ minor + dottedVersion = f'{major}.{minor}' - xrefName = 'versions-' + dottedVersion + linkSuffix + xrefName = f'versions-{dottedVersion}{linkSuffix}' xrefText = self.conventions.api_name() + ' ' + dottedVersion - doc = 'ifdef::' + apiVersion + '[]\n' - doc += ' ' + self.specLink(xrefName, xrefText, isRefpage) + '\n' - doc += 'endif::' + apiVersion + '[]\n' - doc += 'ifndef::' + apiVersion + '[]\n' - doc += ' ' + self.conventions.api_name() + ' ' + dottedVersion + '\n' - doc += 'endif::' + apiVersion + '[]\n' + doc = f'ifdef::{apiVersion}[]\n' + doc += f' {self.specLink(xrefName, xrefText, isRefpage)}\n' + doc += f'endif::{apiVersion}[]\n' + doc += f'ifndef::{apiVersion}[]\n' + doc += f' {self.conventions.api_name()} {dottedVersion}\n' + doc += f'endif::{apiVersion}[]\n' return doc def conditionalLinkExt(self, extName, indent = ' '): - doc = 'ifdef::' + extName + '[]\n' - doc += indent + self.conventions.formatExtension(extName) + '\n' - doc += 'endif::' + extName + '[]\n' - doc += 'ifndef::' + extName + '[]\n' - doc += indent + '`' + extName + '`\n' - doc += 'endif::' + extName + '[]\n' + doc = f'ifdef::{extName}[]\n' + doc += f'{indent}{self.conventions.formatExtension(extName)}\n' + doc += f'endif::{extName}[]\n' + doc += f'ifndef::{extName}[]\n' + doc += f'{indent}`{extName}`\n' + doc += f'endif::{extName}[]\n' return doc @@ -257,18 +259,18 @@ class Extension: generating a specification extension appendix include""" if isRefpage: - filename = self.filename.replace('meta/', 'meta/refpage.') + filename = self.filename.with_name(f"refpage.{self.filename.name}") else: filename = self.filename fp = self.generator.newFile(filename) if not isRefpage: - write('[[' + self.name + ']]', file=fp) - write('== ' + self.name, file=fp) + write(f'[[{self.name}]]', file=fp) + write(f'== {self.name}', file=fp) write('', file=fp) - self.writeTag('Name String', '`' + self.name + '`', isRefpage, fp) + self.writeTag('Name String', f'`{self.name}`', isRefpage, fp) if 
self.conventions.write_extension_type: self.writeTag('Extension Type', self.typeToStr(), isRefpage, fp) @@ -406,7 +408,7 @@ class Extension: else: prettyHandle = handle - write(' * ' + name + ' ' + prettyHandle, file=fp) + write(f' * {name} {prettyHandle}', file=fp) write('', file=fp) # Check if a proposal document for this extension exists in the @@ -423,7 +425,7 @@ class Extension: """Check if a proposal document for an extension exists, returning the path to that proposal or None otherwise.""" - path = 'proposals/{}.adoc'.format(extname) + path = f'proposals/{extname}.adoc' if os.path.exists(path) and os.access(path, os.R_OK): return path else: @@ -494,8 +496,8 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator): def beginFile(self, genOpts): OutputGenerator.beginFile(self, genOpts) - - self.directory = self.genOpts.directory + assert self.genOpts + self.directory = Path(self.genOpts.directory) self.file_suffix = self.genOpts.conventions.file_suffix # Iterate over all 'tag' Elements and add the names of all the valid vendor @@ -532,23 +534,23 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator): def conditionalExt(self, extName, content, ifdef = None, condition = None): doc = '' - innerdoc = 'ifdef::' + extName + '[]\n' + innerdoc = f'ifdef::{extName}[]\n' innerdoc += content + '\n' - innerdoc += 'endif::' + extName + '[]\n' + innerdoc += f'endif::{extName}[]\n' if ifdef: if ifdef == 'ifndef': if condition: - doc += 'ifndef::' + condition + '[]\n' + doc += f'ifndef::{condition}[]\n' doc += innerdoc - doc += 'endif::' + condition + '[]\n' + doc += f'endif::{condition}[]\n' else: # no condition is as if condition is defined; "nothing" is always defined :p pass # so no output elif ifdef == 'ifdef': if condition: - doc += 'ifdef::' + condition + '+' + extName + '[]\n' + doc += f'ifdef::{condition}+{extName}[]\n' doc += content + '\n' # does not include innerdoc; the ifdef was merged with the one above - doc += 'endif::' + condition + '+' + extName + 
'[]\n' + doc += f'endif::{condition}+{extName}[]\n' else: # no condition is as if condition is defined; "nothing" is always defined :p doc += innerdoc else: # should be unreachable @@ -598,9 +600,10 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator): promotedExtensions.setdefault(ext.supercedingAPIVersion, []).append(ext.name) for coreVersion, extensions in promotedExtensions.items(): - promoted_extensions_fp = self.newFile(self.directory + '/promoted_extensions_' + coreVersion + self.file_suffix) + promoted_extensions_fp = self.newFile(self.directory / f"promoted_extensions_{coreVersion}{self.file_suffix}") for extname in sorted(extensions, key=makeSortKey): + ext = self.extensions[extname] indent = '' write(' * {blank}\n+\n' + ext.conditionalLinkExt(extname, indent), file=promoted_extensions_fp) @@ -608,17 +611,17 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator): # Generate include directives for the extensions appendix, grouping # extensions by status (current, deprecated, provisional, etc.) 
- with self.newFile(self.directory + '/current_extensions_appendix' + self.file_suffix) as current_extensions_appendix_fp, \ - self.newFile(self.directory + '/deprecated_extensions_appendix' + self.file_suffix) as deprecated_extensions_appendix_fp, \ - self.newFile(self.directory + '/current_extension_appendices' + self.file_suffix) as current_extension_appendices_fp, \ - self.newFile(self.directory + '/current_extension_appendices_toc' + self.file_suffix) as current_extension_appendices_toc_fp, \ - self.newFile(self.directory + '/deprecated_extension_appendices' + self.file_suffix) as deprecated_extension_appendices_fp, \ - self.newFile(self.directory + '/deprecated_extension_appendices_toc' + self.file_suffix) as deprecated_extension_appendices_toc_fp, \ - self.newFile(self.directory + '/deprecated_extensions_guard_macro' + self.file_suffix) as deprecated_extensions_guard_macro_fp, \ - self.newFile(self.directory + '/provisional_extensions_appendix' + self.file_suffix) as provisional_extensions_appendix_fp, \ - self.newFile(self.directory + '/provisional_extension_appendices' + self.file_suffix) as provisional_extension_appendices_fp, \ - self.newFile(self.directory + '/provisional_extension_appendices_toc' + self.file_suffix) as provisional_extension_appendices_toc_fp, \ - self.newFile(self.directory + '/provisional_extensions_guard_macro' + self.file_suffix) as provisional_extensions_guard_macro_fp: + with self.newFile(self.directory / f"current_extensions_appendix{self.file_suffix}") as current_extensions_appendix_fp, \ + self.newFile(self.directory / f"deprecated_extensions_appendix{self.file_suffix}") as deprecated_extensions_appendix_fp, \ + self.newFile(self.directory / f"current_extension_appendices{self.file_suffix}") as current_extension_appendices_fp, \ + self.newFile(self.directory / f"current_extension_appendices_toc{self.file_suffix}") as current_extension_appendices_toc_fp, \ + self.newFile(self.directory / 
f"deprecated_extension_appendices{self.file_suffix}") as deprecated_extension_appendices_fp, \ + self.newFile(self.directory / f"deprecated_extension_appendices_toc{self.file_suffix}") as deprecated_extension_appendices_toc_fp, \ + self.newFile(self.directory / f"deprecated_extensions_guard_macro{self.file_suffix}") as deprecated_extensions_guard_macro_fp, \ + self.newFile(self.directory / f"provisional_extensions_appendix{self.file_suffix}") as provisional_extensions_appendix_fp, \ + self.newFile(self.directory / f"provisional_extension_appendices{self.file_suffix}") as provisional_extension_appendices_fp, \ + self.newFile(self.directory / f"provisional_extension_appendices_toc{self.file_suffix}") as provisional_extension_appendices_toc_fp, \ + self.newFile(self.directory / f"provisional_extensions_guard_macro{self.file_suffix}") as provisional_extensions_guard_macro_fp: # Note: there is a hardwired assumption in creating the # include:: directives below that all of these files are located @@ -626,60 +629,80 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator): # This is difficult to change, and it is very unlikely changing # it will be needed. - # Do not include the lengthy '*extension_appendices_toc' indices - # in the Antora site build, since all the extensions are already - # indexed on the right navigation sidebar. 
-
-        write('', file=current_extensions_appendix_fp)
-        write('include::{generated}/meta/deprecated_extensions_guard_macro' + self.file_suffix + '[]', file=current_extensions_appendix_fp)
-        write('', file=current_extensions_appendix_fp)
-        write('ifndef::HAS_DEPRECATED_EXTENSIONS[]', file=current_extensions_appendix_fp)
-        write('[[extension-appendices-list]]', file=current_extensions_appendix_fp)
-        write('== List of Extensions', file=current_extensions_appendix_fp)
-        write('endif::HAS_DEPRECATED_EXTENSIONS[]', file=current_extensions_appendix_fp)
-        write('ifdef::HAS_DEPRECATED_EXTENSIONS[]', file=current_extensions_appendix_fp)
-        write('[[extension-appendices-list]]', file=current_extensions_appendix_fp)
-        write('== List of Current Extensions', file=current_extensions_appendix_fp)
-        write('endif::HAS_DEPRECATED_EXTENSIONS[]', file=current_extensions_appendix_fp)
-        write('', file=current_extensions_appendix_fp)
-        write('ifndef::site-gen-antora[]', file=current_extensions_appendix_fp)
-        write('include::{generated}/meta/current_extension_appendices_toc' + self.file_suffix + '[]', file=current_extensions_appendix_fp)
-        write('endif::site-gen-antora[]', file=current_extensions_appendix_fp)
-        write('\n<<<\n', file=current_extensions_appendix_fp)
-        write('include::{generated}/meta/current_extension_appendices' + self.file_suffix + '[]', file=current_extensions_appendix_fp)
-
-        write('', file=deprecated_extensions_appendix_fp)
-        write('include::{generated}/meta/deprecated_extensions_guard_macro' + self.file_suffix + '[]', file=deprecated_extensions_appendix_fp)
-        write('', file=deprecated_extensions_appendix_fp)
-        write('ifdef::HAS_DEPRECATED_EXTENSIONS[]', file=deprecated_extensions_appendix_fp)
-        write('[[deprecated-extension-appendices-list]]', file=deprecated_extensions_appendix_fp)
-        write('== List of Deprecated Extensions', file=deprecated_extensions_appendix_fp)
-        write('ifndef::site-gen-antora[]', file=deprecated_extensions_appendix_fp)
-        write('include::{generated}/meta/deprecated_extension_appendices_toc' + self.file_suffix + '[]', file=deprecated_extensions_appendix_fp)
-        write('endif::site-gen-antora[]', file=deprecated_extensions_appendix_fp)
-        write('\n<<<\n', file=deprecated_extensions_appendix_fp)
-        write('include::{generated}/meta/deprecated_extension_appendices' + self.file_suffix + '[]', file=deprecated_extensions_appendix_fp)
-        write('endif::HAS_DEPRECATED_EXTENSIONS[]', file=deprecated_extensions_appendix_fp)
+        def write_appendix_header(guard_prefix,
+                                  prefix,
+                                  file_suffix,
+                                  section_title,
+                                  fp,
+                                  guard = None):
+            """Write header of an extension appendix section to a file.
+            The current, deprecated, and provisional section headers
+            are sufficiently similar to factor this out.
+
+            - guard_prefix - prefix to included guard file name
+            - prefix - prefix to other filenames and to anchors
+            - file_suffix - supplied by the APIConventions object
+            - section_title - markup for title of this section
+            - fp - file pointer to write to
+            - guard - asciidoc attribute protecting against multiple
+              inclusion, or None"""
+
+            if guard is not None:
+                ifdef_protect = f'ifdef::{guard}[]'
+                endif_protect = f'endif::{guard}[]'
+            else:
+                ifdef_protect = ''
+                endif_protect = ''
+
+            # Do not include the lengthy '*extension_appendices_toc' indices
+            # in the Antora site build, since all the extensions are already
+            # indexed on the right navigation sidebar.
+
+            print('',
+                  f'include::{{generated}}/meta/{guard_prefix}_extensions_guard_macro{file_suffix}[]',
+                  '',
+                  ifdef_protect,
+                  f'[[{prefix}-extension-appendices-list]]',
+                  section_title,
+                  'ifndef::site-gen-antora[]',
+                  f'include::{{generated}}/meta/{prefix}_extension_appendices_toc{file_suffix}[]',
+                  'endif::site-gen-antora[]',
+                  '\n<<<\n',
+                  f'include::{{generated}}/meta/{prefix}_extension_appendices{file_suffix}[]',
+                  endif_protect,
+                  file=fp, sep='\n')
+
+        write_appendix_header(guard_prefix = 'deprecated',
+                              prefix = 'current',
+                              file_suffix = self.file_suffix,
+                              section_title = '\
+ifdef::HAS_DEPRECATED_EXTENSIONS[:sectitle: Current Extensions]\n\
+ifndef::HAS_DEPRECATED_EXTENSIONS[:sectitle: Extensions]\n\
+== List of {sectitle}',
+                              fp = current_extensions_appendix_fp,
+                              guard = None)
+
+        write_appendix_header(guard_prefix = 'deprecated',
+                              prefix = 'deprecated',
+                              file_suffix = self.file_suffix,
+                              section_title = '== List of Deprecated Extensions',
+                              fp = deprecated_extensions_appendix_fp,
+                              guard = 'HAS_DEPRECATED_EXTENSIONS')

         # add include guards to allow multiple includes
         write('ifndef::DEPRECATED_EXTENSIONS_GUARD_MACRO_INCLUDE_GUARD[]', file=deprecated_extensions_guard_macro_fp)
         write(':DEPRECATED_EXTENSIONS_GUARD_MACRO_INCLUDE_GUARD:\n', file=deprecated_extensions_guard_macro_fp)
+
+        write_appendix_header(guard_prefix = 'provisional',
+                              prefix = 'provisional',
+                              file_suffix = self.file_suffix,
+                              section_title = '== List of Provisional Extensions',
+                              fp = provisional_extensions_appendix_fp,
+                              guard = 'HAS_PROVISIONAL_EXTENSIONS')
+
         write('ifndef::PROVISIONAL_EXTENSIONS_GUARD_MACRO_INCLUDE_GUARD[]', file=provisional_extensions_guard_macro_fp)
         write(':PROVISIONAL_EXTENSIONS_GUARD_MACRO_INCLUDE_GUARD:\n', file=provisional_extensions_guard_macro_fp)
-        write('', file=provisional_extensions_appendix_fp)
-        write('include::{generated}/meta/provisional_extensions_guard_macro' + self.file_suffix + '[]', file=provisional_extensions_appendix_fp)
-        write('', file=provisional_extensions_appendix_fp)
-        write('ifdef::HAS_PROVISIONAL_EXTENSIONS[]', file=provisional_extensions_appendix_fp)
-        write('[[provisional-extension-appendices-list]]', file=provisional_extensions_appendix_fp)
-        write('== List of Provisional Extensions', file=provisional_extensions_appendix_fp)
-        write('ifndef::site-gen-antora[]', file=provisional_extensions_appendix_fp)
-        write('include::{generated}/meta/provisional_extension_appendices_toc' + self.file_suffix + '[]', file=provisional_extensions_appendix_fp)
-        write('endif::site-gen-antora[]', file=provisional_extensions_appendix_fp)
-        write('\n<<<\n', file=provisional_extensions_appendix_fp)
-        write('include::{generated}/meta/provisional_extension_appendices' + self.file_suffix + '[]', file=provisional_extensions_appendix_fp)
-        write('endif::HAS_PROVISIONAL_EXTENSIONS[]', file=provisional_extensions_appendix_fp)

         # Emit extensions in author ID order
         sorted_keys = sorted(self.extensions.keys(), key=makeSortKey)
         for name in sorted_keys:
@@ -693,7 +716,8 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator):
             link = ' * ' + self.conventions.formatExtension(ext.name)

-            if ext.provisional == 'true':
+            # If something is provisional and deprecated, it is deprecated.
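The refactor above replaces long runs of one-line `write()` calls with a single `print()` call using `sep='\n'`, the idiom `write_appendix_header()` is built on. A minimal sketch of that idiom (the helper name and sample lines here are illustrative, not from the scripts):

```python
import io

def write_section(fp, title, lines):
    # Emit a blank line, the section title, then each body line, all in
    # one print() call with sep='\n' -- the same consolidation the
    # refactored write_appendix_header() applies to many write() calls.
    print('', title, *lines, sep='\n', file=fp)

buf = io.StringIO()
write_section(buf, '== List of Extensions', ['first', 'second'])
assert buf.getvalue() == '\n== List of Extensions\nfirst\nsecond\n'
```

Because `print()` appends its own `end='\n'`, every argument ends up on its own line, matching what the earlier sequence of `write()` calls produced.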
+            if ext.provisional == 'true' and ext.deprecationType is None:
                 write(self.conditionalExt(ext.name, include), file=provisional_extension_appendices_fp)
                 write(self.conditionalExt(ext.name, link), file=provisional_extension_appendices_toc_fp)
                 write(self.conditionalExt(ext.name, ':HAS_PROVISIONAL_EXTENSIONS:'), file=provisional_extensions_guard_macro_fp)
@@ -747,7 +771,7 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator):
         specialuse = self.getAttrib(interface, 'specialuse', OPTIONAL)
         ratified = self.getAttrib(interface, 'ratified', OPTIONAL, '')

-        filename = self.directory + '/' + name + self.file_suffix
+        filename = self.directory / f"{name}{self.file_suffix}"

         extdata = Extension(
             generator = self,
@@ -802,7 +826,6 @@ class ExtensionMetaDocOutputGenerator(OutputGenerator):
     def getSpecVersion(self, elem, extname, default=None):
         """Determine the extension revision from the EXTENSION_NAME_SPEC_VERSION enumerant.
-        This only makes sense for Vulkan.

         - elem - <extension> element to query
         - extname - extension name from the <extension> 'name' attribute
diff --git a/scripts/genRef.py b/scripts/genRef.py
index 9b78fd0d..8e6b884d 100755
--- a/scripts/genRef.py
+++ b/scripts/genRef.py
@@ -79,10 +79,12 @@ def printCopyrightSourceComments(fp):
     Writes an asciidoc comment block, which copyrights the source file."""
+    # REUSE-IgnoreStart
     print('// Copyright 2014-2024 The Khronos Group Inc.', file=fp)
     print('//', file=fp)
     # This works around constraints of the 'reuse' tool
     print('// SPDX' + '-License-Identifier: CC-BY-4.0', file=fp)
+    # REUSE-IgnoreEnd
     print('', file=fp)
diff --git a/scripts/generator.py b/scripts/generator.py
index dea2ffa3..3acfd85e 100644
--- a/scripts/generator.py
+++ b/scripts/generator.py
@@ -65,7 +65,7 @@ def regSortCategoryKey(feature):
         else:
             return 0

-    if feature.category.upper() in ['ARB', 'KHR', 'OES']:
+    if feature.category.upper() in ('ARB', 'KHR', 'OES'):
         return 1

     return 2
@@ -860,7 +860,7 @@ class OutputGenerator:
         """Create a directory, if not
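The `filename` change above swaps manual `'/'` concatenation for `pathlib`'s `/` operator, which builds paths portably across platforms. A small sketch (the directory and extension name here are hypothetical, for illustration only):

```python
from pathlib import Path

# pathlib's '/' operator replaces string concatenation like
# `self.directory + '/' + name + self.file_suffix`, using the correct
# separator on every platform.
directory = Path('gen') / 'meta'
filename = directory / ('VK_EXT_example' + '.adoc')

assert filename.name == 'VK_EXT_example.adoc'
assert filename.suffix == '.adoc'
assert filename.parts[-2] == 'meta'
```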
already done.

         Generally called from derived generators creating hierarchies."""
-        self.logMsg('diag', 'OutputGenerator::makeDir(' + path + ')')
+        self.logMsg('diag', 'OutputGenerator::makeDir(', path, ')')
         if path not in self.madeDirs:
             # This can get race conditions with multiple writers, see
             # https://stackoverflow.com/questions/273192/
@@ -919,11 +919,11 @@ class OutputGenerator:
         # On successfully generating output, move the temporary file to the
         # target file.
         if self.genOpts.filename is not None:
+            directory = Path(self.genOpts.directory)
             if sys.platform == 'win32':
-                directory = Path(self.genOpts.directory)
                 if not Path.exists(directory):
                     os.makedirs(directory)
-            shutil.copy(self.outFile.name, self.genOpts.directory + '/' + self.genOpts.filename)
+            shutil.copy(self.outFile.name, directory / self.genOpts.filename)
             os.remove(self.outFile.name)

         self.genOpts = None
diff --git a/scripts/hostsyncgenerator.py b/scripts/hostsyncgenerator.py
index 70e0bbf0..0a79e24d 100644
--- a/scripts/hostsyncgenerator.py
+++ b/scripts/hostsyncgenerator.py
@@ -8,6 +8,7 @@ from generator import OutputGenerator, write
 from spec_tools.attributes import ExternSyncEntry
 from spec_tools.validity import ValidityCollection, ValidityEntry
 from spec_tools.util import getElemName
+from pathlib import Path


 class HostSynchronizationOutputGenerator(OutputGenerator):
@@ -41,7 +42,8 @@ class HostSynchronizationOutputGenerator(OutputGenerator):
         - directory - subdirectory to put file in
         - basename - base name of the file
         - contents - contents of the file (Asciidoc boilerplate aside)"""
-        filename = self.genOpts.directory + '/' + basename
+        assert self.genOpts
+        filename = Path(self.genOpts.directory) / basename
         self.logMsg('diag', '# Generating include file:', filename)
         with open(filename, 'w', encoding='utf-8') as fp:
             write(self.genOpts.conventions.warning_comment, file=fp)
@@ -57,13 +59,15 @@ class HostSynchronizationOutputGenerator(OutputGenerator):

     def writeInclude(self):
         "Generates the asciidoc
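The `makeDir` comment above notes that a check-then-create sequence "can get race conditions with multiple writers": two generator processes can both see the directory missing and both try to create it. A minimal sketch of the race-free alternative, using only the standard library:

```python
import os
import tempfile

def ensure_dir(path):
    # `if not exists: os.makedirs(path)` can race when several writers
    # start at once and both pass the check; exist_ok=True makes the
    # call idempotent, so a concurrent creation is not an error.
    os.makedirs(path, exist_ok=True)

with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, 'generated', 'meta')
    ensure_dir(target)
    ensure_dir(target)  # second call does not raise
    assert os.path.isdir(target)
```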
include files."""
-        self.writeBlock('parameters.adoc',
+        assert self.genOpts
+        file_suffix = self.genOpts.conventions.file_suffix
+        self.writeBlock(f'parameters{file_suffix}',
                         'Externally Synchronized Parameters',
                         self.threadsafety['parameters'])
-        self.writeBlock('parameterlists.adoc',
+        self.writeBlock(f'parameterlists{file_suffix}',
                         'Externally Synchronized Parameter Lists',
                         self.threadsafety['parameterlists'])
-        self.writeBlock('implicit.adoc',
+        self.writeBlock(f'implicit{file_suffix}',
                         'Implicit Externally Synchronized Parameters',
                         self.threadsafety['implicit'])
diff --git a/scripts/json_c_generator.py b/scripts/json_c_generator.py
index 12e9ae2e..24a57fe9 100644
--- a/scripts/json_c_generator.py
+++ b/scripts/json_c_generator.py
@@ -242,7 +242,7 @@ class JSONCOutputGenerator(OutputGenerator):
     def endFeature(self):
         if self.emit:
             if self.feature_not_empty:
-                if self.genOpts.conventions.writeFeature(self.featureExtraProtect, self.genOpts.filename):
+                if self.genOpts.conventions.writeFeature(self.featureName, self.featureExtraProtect, self.genOpts.filename):

                     for section in self.TYPE_SECTIONS:
                         contents = self.sections[section]
diff --git a/scripts/json_generator.py b/scripts/json_generator.py
index 9d5b3b6e..269c6b52 100644
--- a/scripts/json_generator.py
+++ b/scripts/json_generator.py
@@ -554,7 +554,7 @@ class JSONOutputGenerator(OutputGenerator):
     def endFeature(self):
         if self.emit:
             if self.feature_not_empty:
-                if self.genOpts.conventions.writeFeature(self.featureExtraProtect, self.genOpts.filename):
+                if self.genOpts.conventions.writeFeature(self.featureName, self.featureExtraProtect, self.genOpts.filename):

                     for section in self.TYPE_SECTIONS:
                         contents = self.sections[section]
diff --git a/scripts/json_h_generator.py b/scripts/json_h_generator.py
index 8271134f..bd2c04a1 100644
--- a/scripts/json_h_generator.py
+++ b/scripts/json_h_generator.py
@@ -175,7 +175,7 @@ class JSONHeaderOutputGenerator(OutputGenerator):
     def endFeature(self):
         if self.emit:
             if self.feature_not_empty:
-                if self.genOpts.conventions.writeFeature(self.featureExtraProtect, self.genOpts.filename):
+                if self.genOpts.conventions.writeFeature(self.featureName, self.featureExtraProtect, self.genOpts.filename):

                     for section in self.TYPE_SECTIONS:
                         contents = self.sections[section]
diff --git a/scripts/json_parser.py b/scripts/json_parser.py
index 8ff90804..ba4d4418 100644
--- a/scripts/json_parser.py
+++ b/scripts/json_parser.py
@@ -574,7 +574,7 @@ class JSONParserGenerator(OutputGenerator):
     def endFeature(self):
         if self.emit:
             if self.feature_not_empty:
-                if self.genOpts.conventions.writeFeature(self.featureExtraProtect, self.genOpts.filename):
+                if self.genOpts.conventions.writeFeature(self.featureName, self.featureExtraProtect, self.genOpts.filename):

                     for section in self.TYPE_SECTIONS:
                         contents = self.sections[section]
diff --git a/scripts/reg.py b/scripts/reg.py
index b8f8af7c..fcc47ac6 100644
--- a/scripts/reg.py
+++ b/scripts/reg.py
@@ -1154,6 +1154,8 @@ class Registry:
                 # Resolve the type info to the actual type, so we get an accurate read for 'structextends'
                 while alias:
                     typeinfo = self.lookupElementInfo(alias, self.typedict)
+                    if not typeinfo:
+                        raise RuntimeError(f"Missing alias {alias}")
                     alias = typeinfo.elem.get('alias')

                 typecat = typeinfo.elem.get('category')
diff --git a/scripts/schema_generator.py b/scripts/schema_generator.py
index e2ddb682..818c12a7 100644
--- a/scripts/schema_generator.py
+++ b/scripts/schema_generator.py
@@ -150,7 +150,7 @@ class SchemaOutputGenerator(OutputGenerator):
     def endFeature(self):
         if self.emit:
             if self.feature_not_empty:
-                if self.genOpts.conventions.writeFeature(self.featureExtraProtect, self.genOpts.filename):
+                if self.genOpts.conventions.writeFeature(self.featureName, self.featureExtraProtect, self.genOpts.filename):

                     for section in self.TYPE_SECTIONS:
                         contents = self.sections[section]
diff --git a/scripts/spec_tools/algo.py b/scripts/spec_tools/algo.py
index e45aaa8e..8af0f0ca 100644
---
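The reg.py change above guards an alias-chain walk: if a link in the chain is missing, it raises a `RuntimeError` naming the missing alias instead of crashing later with an opaque `NoneType` error. A standalone sketch of that pattern (the dictionary and type names are hypothetical, not from the registry):

```python
# Follow 'alias' links until reaching a concrete entry, failing loudly
# when the chain points at something that does not exist -- mirroring
# the `raise RuntimeError(f"Missing alias {alias}")` guard in reg.py.
typedict = {
    'VkFooKHR': {'alias': 'VkFoo'},
    'VkFoo': {'category': 'struct'},
}

def resolve(name):
    info = typedict.get(name)
    while info is not None and 'alias' in info:
        name = info['alias']
        info = typedict.get(name)
        if info is None:
            raise RuntimeError(f'Missing alias {name}')
    return name

assert resolve('VkFooKHR') == 'VkFoo'
```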
a/scripts/spec_tools/algo.py
+++ b/scripts/spec_tools/algo.py
@@ -83,7 +83,7 @@ def longest_common_prefix(strings):
     'abc'
     """
-    assert(len(strings) > 1)
+    assert len(strings) > 1
     a = min(strings)
     b = max(strings)
     prefix = []
@@ -127,7 +127,7 @@ def longest_common_token_prefix(strings, delimiter='_'):
     ''
     """
-    assert(len(strings) > 1)
+    assert len(strings) > 1
     a = min(strings).split(delimiter)
     b = max(strings).split(delimiter)
     prefix_tokens = []
diff --git a/scripts/spec_tools/attributes.py b/scripts/spec_tools/attributes.py
index 8cbec176..6449d8f3 100644
--- a/scripts/spec_tools/attributes.py
+++ b/scripts/spec_tools/attributes.py
@@ -64,7 +64,7 @@ class LengthEntry:
         return self.full_reference

     def get_human_readable(self, make_param_name=None):
-        assert(self.other_param_name)
+        assert self.other_param_name
         return _human_readable_deref(self.full_reference, make_param_name=make_param_name)

     def __repr__(self):
@@ -97,7 +97,7 @@ class ExternSyncEntry:
             self.member = self.param_ref_parts[0]

     def get_human_readable(self, make_param_name=None):
-        assert(not self.entirely_extern_sync)
+        assert not self.entirely_extern_sync
         return _human_readable_deref(self.full_reference, make_param_name=make_param_name)

     @staticmethod
diff --git a/scripts/spec_tools/consistency_tools.py b/scripts/spec_tools/consistency_tools.py
index 74365e1f..90b6cb56 100644
--- a/scripts/spec_tools/consistency_tools.py
+++ b/scripts/spec_tools/consistency_tools.py
@@ -1,6 +1,7 @@
 #!/usr/bin/python3 -i
 #
 # Copyright (c) 2019 Collabora, Ltd.
+# Copyright 2018-2024 The Khronos Group Inc.
 #
 # SPDX-License-Identifier: Apache-2.0
 #
@@ -153,7 +154,16 @@ class XMLChecker:
         Returns the stripped name and the tag,
         or the input and None if there was no tag.
         """
+        # Author tag can be suffixed with experimental version
+        name_no_experimental = re.sub("X[0-9]*$", "", name)
+
         for t in self.tags:
+            if (
+                self.conventions.allows_x_number_suffix
+                and name_no_experimental.endswith(t)
+            ):
+                name = name_no_experimental
+
             if name.endswith(t):
                 name = name[:-(len(t))]
                 if name[-1] == "_":
@@ -241,17 +251,23 @@ class XMLChecker:
         print('xml_consistency/consistency_tools error and warning messages follow.')

         for entity in entities_with_messages:
+            print()
+            print('-------------------')
+            print('Messages for', entity)
+            print()
             messages = self.errors.get(entity)
             if messages:
-                print(f'\nError messages for {entity}')
                 for m in messages:
-                    print('ERROR:', m)
+                    print('Error:', m)

             messages = self.warnings.get(entity)
-            if messages and self.display_warnings:
-                print(f'\nWarning messages for {entity}')
-                for m in messages:
-                    print('WARNING:', m)
+            if messages:
+                if self.display_warnings:
+                    for m in messages:
+                        print('Warning:', m)
+                else:
+                    print('Warnings are not shown - try using --include_warn')
+

     def check_param(self, param):
         """Check a member of a struct or a param of a function.
diff --git a/scripts/spec_tools/conventions.py b/scripts/spec_tools/conventions.py
index 50ca75d4..9c583f56 100644
--- a/scripts/spec_tools/conventions.py
+++ b/scripts/spec_tools/conventions.py
@@ -153,6 +153,11 @@ class ConventionsBase(abc.ABC):
         return 'code:'

     @property
+    def allows_x_number_suffix(self):
+        """Whether vendor tags can be suffixed with X and a number to mark experimental extensions."""
+        return False
+
+    @property
     @abc.abstractmethod
     def structtype_member_name(self):
         """Return name of the structure type member.
@@ -213,7 +218,7 @@ class ConventionsBase(abc.ABC):

         Do not edit these defaults, override self.makeProseList().
""" - assert(serial_comma) # did not implement what we did not need + assert serial_comma # did not implement what we did not need if isinstance(fmt, str): fmt = ProseListFormats.from_string(fmt) @@ -366,7 +371,7 @@ class ConventionsBase(abc.ABC): May override.""" return self.api_prefix + 'EXT_' - def writeFeature(self, featureExtraProtect, filename): + def writeFeature(self, featureName, featureExtraProtect, filename): """Return True if OutputGenerator.endFeature should write this feature. Defaults to always True. diff --git a/scripts/spec_tools/entity_db.py b/scripts/spec_tools/entity_db.py index f79f6267..23456e54 100644 --- a/scripts/spec_tools/entity_db.py +++ b/scripts/spec_tools/entity_db.py @@ -249,7 +249,7 @@ class EntityDatabase(ABC): if alias in self._byEntity: return self.findEntity(alias) - assert(not "Alias without main entry!") + assert not "Alias without main entry!" return None @@ -400,7 +400,7 @@ class EntityDatabase(ABC): alias_set = self._aliasSetsByEntity.get(first_entity_name) if not alias_set: # If this assert fails, we have goofed in addAlias - assert(second_entity_name not in self._aliasSetsByEntity) + assert second_entity_name not in self._aliasSetsByEntity return False @@ -452,7 +452,7 @@ class EntityDatabase(ABC): other_alias_set = self._aliasSetsByEntity.get(entityName) if alias_set and other_alias_set: # If this fails, we need to merge sets and update. - assert(alias_set is other_alias_set) + assert alias_set is other_alias_set if not alias_set: # Try looking by the other name. 
diff --git a/scripts/spec_tools/macro_checker.py b/scripts/spec_tools/macro_checker.py
index fe150341..3a0a4b7a 100644
--- a/scripts/spec_tools/macro_checker.py
+++ b/scripts/spec_tools/macro_checker.py
@@ -18,7 +18,7 @@ class MacroChecker(object):
     """

     def __init__(self, enabled_messages, entity_db,
-                 macro_checker_file_type, root_path):
+                 macro_checker_file_type, root_path, conventions):
         """Construct an object that tracks checking one or more files in an API spec.

         enabled_messages -- a set of MessageId that should be enabled.
@@ -26,11 +26,13 @@ class MacroChecker(object):
         macro_checker_file_type -- Type to instantiate to create the right
                                    MacroCheckerFile subclass for this API.
         root_path -- A Path object for the root of this repository.
+        conventions -- A ConventionsBase object.
         """
         self.enabled_messages = enabled_messages
         self.entity_db = entity_db
         self.macro_checker_file_type = macro_checker_file_type
         self.root_path = root_path
+        self.conventions = conventions

         self.files = []
@@ -47,20 +49,27 @@ class MacroChecker(object):
         # apiPrefix, followed by some word characters or * as many times as desired,
         # NOT followed by >> and NOT preceded by one of the characters in that first character class.
         # (which distinguish "names being used somewhere other than prose").
-        self.suspected_missing_macro_re = re.compile(
-            r'\b(?<![-=:/[\.`+,])(?P<entity_name>{}[\w*]+)\b(?!>>)'.format(
-                self.entity_db.case_insensitive_name_prefix_pattern)
+        self.suspected_missing_macro_re = re.compile(fr'''
+            \b(?<![-=:/[\.`+,])  # NOT preceded by one of these characters
+            (?P<entity_name>{self.entity_db.case_insensitive_name_prefix_pattern}[\w*]+)  # Something that looks like our entity names
+            \b(?!>>)  # NOT followed by >>
+            ''', re.VERBOSE
         )
         self.heading_command_re = re.compile(
-            r'=+ (?P<command>{}[\w]+)'.format(self.entity_db.name_prefix)
+            fr'=+ (?P<command>{self.entity_db.name_prefix}[\w]+)'
         )

         macros_pattern = '|'.join((re.escape(macro) for macro in self.entity_db.macros))
         # the "formatting" group is to strip matching */**/_/__
         # surrounding an entire macro.
-        self.macro_re = re.compile(
-            r'(?P<formatting>\**|_*)(?P<macro>{}):(?P<entity_name>[\w*]+((?P<subscript>[\[][^\]]*[\]]))?)(?P=formatting)'.format(macros_pattern))
+        self.macro_re = re.compile(fr'''
+            (?P<formatting>\**|_*)  # opening formatting
+            (?P<macro>{macros_pattern}):  # macro name and colon
+            (?P<entity_name>[\w*]+(?P<subscript>\[([^\]]*)\])?)
+            (?P=formatting)  # matching trailing formatting
+            ''',
+            re.VERBOSE)
diff --git a/scripts/spec_tools/macro_checker_file.py b/scripts/spec_tools/macro_checker_file.py
index cd45bf3f..02bd67df 100644
--- a/scripts/spec_tools/macro_checker_file.py
+++ b/scripts/spec_tools/macro_checker_file.py
@@ -11,6 +11,7 @@ import re
 from collections import OrderedDict, namedtuple
 from enum import Enum
 from inspect import currentframe
+from typing import Set

 from .shared import (AUTO_FIX_STRING, CATEGORIES_WITH_VALIDITY,
                      EXTENSION_CATEGORY, NON_EXISTENT_MACROS, EntityData,
@@ -48,7 +49,7 @@ allowed_path_attributes = {

 # Matches a generated (api or validity) include line.
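The regex rewrite above changes only the notation, not the behavior: `re.VERBOSE` allows whitespace and `#` comments inside a pattern, so an opaque one-liner becomes self-documenting. A sketch showing that a compact pattern and its verbose form match identically (the heading-like pattern and sample line are illustrative):

```python
import re

# In VERBOSE mode, literal spaces must be escaped or placed in a
# character class ([ ]); otherwise they are ignored as layout.
compact = re.compile(r'=+ (?P<command>vk[\w]+)')
verbose = re.compile(r'''
    =+[ ]                 # one or more '=' (heading markup), then a space
    (?P<command>vk[\w]+)  # a command name with the API prefix
    ''', re.VERBOSE)

line = '== vkCreateDevice'
assert compact.search(line).group('command') == 'vkCreateDevice'
assert verbose.search(line).group('command') == 'vkCreateDevice'
```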
 INCLUDE = re.compile(
-    r'include::(?P<directory_traverse>((../){1,4}|\{generated\}/)(generated/)?)(?P<generated_type>(api|validity))/(?P<category>\w+)/(?P<entity_name>[^./]+).adoc[\[][\]]')
+    r'include::(?P<directory_traverse>((../){1,4}|\{generated\}/)(generated/)?)(?P<generated_type>(api|validity))/(?P<category>\w+)/(?P<entity_name>[^./]+).adoc[\[][\]]')  # noqa

 # Matches an [[AnchorLikeThis]]
 ANCHOR = re.compile(r'\[\[(?P<entity_name>[^\]]+)\]\]')
@@ -61,7 +62,7 @@ PRECEDING_MEMBER_REFERENCE = re.compile(
 # Matches something like slink:foo::pname:bar as well as
 # the under-marked-up slink:foo::bar.
 MEMBER_REFERENCE = re.compile(
-    r'\b(?P<first_part>(?P<scope_macro>[fs](text|link)):(?P<scope>[\w*]+))(?P<double_colons>::)(?P<second_part>(?P<member_macro>pname:?)(?P<entity_name>[\w]+))\b'
+    r'\b(?P<first_part>(?P<scope_macro>[fs](text|link)):(?P<scope>[\w*]+))(?P<double_colons>::)(?P<second_part>(?P<member_macro>pname:?)(?P<entity_name>[\w]+))\b'  # noqa
 )

 # Matches if a string ends while a link is still "open".
@@ -95,32 +96,6 @@ BRACKETS = re.compile(r'\[(?P<tags>.*)\]')
 REF_PAGE_ATTRIB = re.compile(
     r"(?P<key>[a-z]+)='(?P<value>[^'\\]*(?:\\.[^'\\]*)*)'")

-# Exceptions to:
-# error: Definition of link target {} with macro etext (used for category enums) does not exist. (-Wwrong_macro)
-# typically caused by using Vulkan-only enums in Vulkan SC blocks with "etext", or because they
-# are suffixed differently.
-CHECK_UNRECOGNIZED_ETEXT_EXCEPTIONS = (
-    'VK_COLORSPACE_SRGB_NONLINEAR_KHR',
-    'VK_COLOR_SPACE_DCI_P3_LINEAR_EXT',
-    'VK_PIPELINE_CACHE_CREATE_READ_ONLY_BIT',
-    'VK_STENCIL_FRONT_AND_BACK',
-    'VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_SHADER_DRAW_PARAMETER_FEATURES',
-    'VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_VARIABLE_POINTER_FEATURES',
-    'VK_STRUCTURE_TYPE_SURFACE_CAPABILITIES2_EXT',
-)
-
-# Exceptions to:
-# warning: Definition of link target {} with macro ename (used for category enums) does not exist. (-Wbad_enumerant)
-# typically caused by Vulkan SC enums not being recognized in Vulkan build
-CHECK_UNRECOGNIZED_ENAME_EXCEPTIONS = (
-    'VK_ERROR_INVALID_PIPELINE_CACHE_DATA',
-    'VK_ERROR_NO_PIPELINE_MATCH',
-    'VK_ERROR_VALIDATION_FAILED',
-    'VK_MEMORY_HEAP_SEU_SAFE_BIT',
-    'VK_PIPELINE_CACHE_CREATE_READ_ONLY_BIT',
-    'VK_PIPELINE_CACHE_CREATE_USE_APPLICATION_STORAGE_BIT',
-    'VK_PIPELINE_CACHE_HEADER_VERSION_SAFETY_CRITICAL_ONE',
-)

 class Attrib(Enum):
     """Attributes of a ref page."""
@@ -221,13 +196,13 @@ def shouldEntityBeText(entity, subscript):
     return False


-class MacroCheckerFile(object):
+class MacroCheckerFile:
     """Object performing processing of a single AsciiDoctor file from a
     specification.

     For testing purposes, may also process a string as if it were a file.
     """

-    def __init__(self, checker, filename, enabled_messages, stream_maker):
+    def __init__(self, checker, filename: str, enabled_messages: Set[MessageId], stream_maker):
         """Construct a MacroCheckerFile object.

         Typically called by MacroChecker.processFile or MacroChecker.processString().
@@ -549,8 +524,10 @@ class MacroCheckerFile(object):
                 self.warning(MessageId.EXTENSION, "Seems like this is an extension name that was not linked.",
                              group='entity_name', replacement=self.makeExtensionLink())
             else:
+                macro = data.macro
+                category = data.category
                 self.warning(MessageId.MISSING_MACRO,
-                             ['Seems like a "{}" macro was omitted for this reference to a known entity in category "{}".'.format(data.macro, data.category),
+                             ['Seems like a "{}" macro was omitted for this reference to a known entity in category "{}".'.format(macro, category),
                               'Wrap in ` ` to silence this if you do not want a verified macro here.'],
                              group='entity_name', replacement=self.makeMacroMarkup(data.macro))
@@ -571,7 +548,8 @@ class MacroCheckerFile(object):
                              group='entity_name', replacement=self.makeExtensionLink(data.entity))
             else:
                 self.warning(MessageId.MISSING_MACRO,
-                             'Seems like a macro was omitted for this reference to a known entity in category "{}", found by searching case-insensitively.'.format(
+                             'Seems like a macro was omitted for this reference to a known entity in category "{}"'
+                             ', found by searching case-insensitively.'.format(
                                  data.category),
                              replacement=self.makeMacroMarkup(data=data))
@@ -732,6 +710,8 @@ class MacroCheckerFile(object):

         # Do some validation of pname references.
         if macro == 'pname':
+            # there is only a macro if there is a match
+            assert match
             # See if there's an immediately-preceding entity
             preceding = self.line[:match.start()]
             scope = PRECEDING_MEMBER_REFERENCE.search(preceding)
@@ -749,6 +729,10 @@ class MacroCheckerFile(object):
                 # can't check this.
                 pass

+    def shouldSkipUnrecognizedEntity(self, macro, entity_name):
+        """Return True if we should not warn about not recognizing a macro invocation for entity_name."""
+        return False
+
     def checkRecognizedEntity(self):
         """Check the current macro:entity match to see if it is recognized.
@@ -759,6 +743,7 @@ class MacroCheckerFile(object):
         """
         entity = self.entity
         macro = self.macro
+        assert macro
         if self.checker.findMacroAndEntity(macro, entity) is not None:
             # We know this macro-entity combo
             return True
@@ -773,11 +758,11 @@ class MacroCheckerFile(object):
                     _pluralize('category', len(possibleCats)),
                     ', '.join(possibleCats))]

+        if self.shouldSkipUnrecognizedEntity(macro, entity):
+            return False
+
         data = self.checker.findEntity(entity)
         if data:
-            if entity in CHECK_UNRECOGNIZED_ETEXT_EXCEPTIONS:
-                return False
-
             # We found the goof: incorrect macro
             msg.append('Apparently matching entity in category {} found.'.format(
                 data.category))
@@ -834,11 +819,9 @@ class MacroCheckerFile(object):
             # hard to check this.
             if self.checker.likelyRecognizedEntity(entity):
                 if not self.checkText():
-                    if entity in CHECK_UNRECOGNIZED_ENAME_EXCEPTIONS:
-                        return False
-                    else:
-                        self.warning(MessageId.BAD_ENUMERANT, msg +
-                                     ['Unrecognized ename:{} that we would expect to recognize since it fits the pattern for this API.'.format(entity)], see_also=see_also)
+                    self.warning(MessageId.BAD_ENUMERANT, msg +
+                                 ['Unrecognized ename:{} that we would expect to recognize since it fits the pattern for this API.'.format(entity)],
+                                 see_also=see_also)
         else:
             # This is fine:
             # it doesn't need to be recognized since it's not linked.
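The changes above replace hard-coded exception tuples with an overridable `shouldSkipUnrecognizedEntity()` hook: the base class skips nothing, and an API-specific checker can override the method with its own knowledge. A sketch of the pattern (the subclass here is hypothetical; `VK_ERROR_NO_PIPELINE_MATCH` is one of the names from the removed tuple):

```python
class BaseCheckerFile:
    def shouldSkipUnrecognizedEntity(self, macro, entity_name):
        """Return True if we should not warn about not recognizing
        a macro invocation for entity_name. Default: never skip."""
        return False

class ScCheckerFile(BaseCheckerFile):
    # A Vulkan SC-aware subclass can whitelist enums the base build
    # does not recognize, instead of a global exception tuple.
    SKIP = {'VK_ERROR_NO_PIPELINE_MATCH'}

    def shouldSkipUnrecognizedEntity(self, macro, entity_name):
        return entity_name in self.SKIP

assert BaseCheckerFile().shouldSkipUnrecognizedEntity('ename', 'VK_ERROR_NO_PIPELINE_MATCH') is False
assert ScCheckerFile().shouldSkipUnrecognizedEntity('ename', 'VK_ERROR_NO_PIPELINE_MATCH') is True
```

This keeps the shared checker API-agnostic: the skip policy lives with the API that needs it rather than in the common module.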
@@ -857,8 +840,8 @@ class MacroCheckerFile(object):
         macro = self.macro
         entity = self.entity
         shouldBeText = shouldEntityBeText(entity, self.subscript)
-        if shouldBeText and not self.macro.endswith(
-                'text') and not self.macro == 'code':
+        assert macro
+        if shouldBeText and not macro.endswith('text') and not macro == 'code':
             newMacro = macro[0] + 'text'
             if self.checker.entity_db.getCategoriesForMacro(newMacro):
                 self.error(MessageId.MISSING_TEXT,
@@ -876,7 +859,7 @@ class MacroCheckerFile(object):
                 "No asterisk/leading or trailing underscore/bracket in the entity, so this might be a mistaken use of the 'text' macro {}:".format(macro)]
             data = self.checker.findEntity(entity)
             if data:
-                if entity in CHECK_UNRECOGNIZED_ETEXT_EXCEPTIONS:
+                if self.shouldSkipUnrecognizedEntity(macro, entity):
                     return False

                 # We found the goof: incorrect macro
@@ -1014,7 +997,10 @@ class MacroCheckerFile(object):
             msg = ["Found reference page markup, but we are already in a refpage block.",
                    "The block before the first message of this type is most likely not closed.", ]
             # Fake-close the previous ref page, if it's trivial to do so.
-            if self.getInnermostBlockEntry().block_type == BlockType.REF_PAGE_LIKE:
+            innermost_block = self.getInnermostBlockEntry()
+            # self.in_ref_page is true only when the innermost block is something
+            assert innermost_block
+            if innermost_block.block_type == BlockType.REF_PAGE_LIKE:
                 msg.append(
                     "Pretending that there was a line with `--` immediately above to close that ref page, for more readable messages.")
                 self.processBlockDelimiter(
@@ -1064,9 +1050,9 @@ class MacroCheckerFile(object):
             else:
                 # TODO suggest fixes here if applicable
                 self.error(MessageId.REFPAGE_NAME,
-                           [ "Found reference page markup, but refpage='{}' type='{}' does not refer to a recognized entity".format(
+                           ["Found reference page markup, but refpage='{}' type='{}' does not refer to a recognized entity".format(
                                text, type_text),
-                            'If this is intentional, add the entity to EXTRA_DEFINES or EXTRA_REFPAGES in check_spec_links.py.' ],
+                            'If this is intentional, add the entity to EXTRA_DEFINES or EXTRA_REFPAGES in check_spec_links.py.'],
                           context=context)
         else:
             self.error(MessageId.REFPAGE_TAG,
@@ -1232,8 +1218,8 @@ class MacroCheckerFile(object):
             msg = ["Found reference page markup, with an enum value listed: {}".format(
                 referenced_entity)]
             self.error(MessageId.REFPAGE_XREFS,
-                        msg,
-                        context=context)
+                       msg,
+                       context=context)
             return

         if data:
@@ -1305,7 +1291,9 @@ class MacroCheckerFile(object):
         If None, will assume it is the direct caller of self.warning().
         """
         if not frame:
-            frame = currentframe().f_back
+            f = currentframe()
+            if f:
+                frame = f.f_back
         self.diag(MessageType.WARNING, message_id, messageLines, group=group,
                   replacement=replacement, context=context, fix=fix, see_also=see_also, frame=frame)
@@ -1332,7 +1320,9 @@ class MacroCheckerFile(object):
         If None, will assume it is the direct caller of self.error().
         """
         if not frame:
-            frame = currentframe().f_back
+            f = currentframe()
+            if f:
+                frame = f.f_back
         self.diag(MessageType.ERROR, message_id, messageLines, group=group,
                   replacement=replacement, context=context, fix=fix, see_also=see_also, frame=frame)
@@ -1375,7 +1365,9 @@ class MacroCheckerFile(object):
             messageLines.append(AUTO_FIX_STRING)

         if not frame:
-            frame = currentframe().f_back
+            f = currentframe()
+            if f:
+                frame = f.f_back
         if context is None:
             message = Message(message_id=message_id,
                               message_type=severity,
@@ -1456,6 +1448,7 @@ class MacroCheckerFile(object):
         include_dict -- The include dictionary to update: one of self.apiIncludes or self.validityIncludes.
         generated_type -- The type of include (e.g. 'api', 'valid', etc). By default, extracted from self.match.
         """
+        assert self.match
         entity = self.match.group('entity_name')
         if generated_type is None:
             generated_type = self.match.group('generated_type')
@@ -1557,6 +1550,7 @@ class MacroCheckerFile(object):

     def handleIncludeMismatchRefPage(self, entity, generated_type):
         """Report a message about an include not matching its containing ref-page block."""
+        assert self.current_ref_page
         self.warning(MessageId.REFPAGE_MISMATCH, "Found {} include for {}, inside the reference page block of {}".format(
             generated_type, entity, self.current_ref_page.entity))
@@ -1630,8 +1624,8 @@ class MacroCheckerFile(object):
     def makeMacroMarkup(self, newMacro=None, newEntity=None, data=None):
         """Construct appropriate markup for referring to an entity.

-        Typically constructs macro:entity, but can construct `<<EXTENSION_NAME>>` if the supplied
-        entity is identified as an extension.
+        Typically constructs macro:entity, but can construct `<<EXTENSION_NAME>>`
+        (or equivalent) if the supplied entity is identified as an extension.

         Arguments:
         newMacro -- The macro to use. Defaults to data.macro (if available), otherwise self.macro.
@@ -1657,14 +1651,14 @@ class MacroCheckerFile(object):
     def makeExtensionLink(self, newEntity=None):
         """Create a correctly-formatted link to an extension.

-        Result takes the form `<<EXTENSION_NAME>>`.
+        Result takes the form `<<EXTENSION_NAME>>` or whatever the Conventions define.

         Argument:
         newEntity -- The extension name to link to. Defaults to self.entity.
         """
         if not newEntity:
             newEntity = self.entity
-        return '`<<{}>>`'.format(newEntity)
+        return self.checker.conventions().formatExtension(newEntity)

     def computeExpectedRefPageFromInclude(self, entity):
         """Compute the expected ref page entity based on an include entity name."""
diff --git a/scripts/spec_tools/util.py b/scripts/spec_tools/util.py
index e67038a5..b1ac5d2d 100644
--- a/scripts/spec_tools/util.py
+++ b/scripts/spec_tools/util.py
@@ -1,6 +1,7 @@
 """Utility functions not closely tied to other spec_tools types."""
 # Copyright (c) 2018-2019 Collabora, Ltd.
 # Copyright 2013-2024 The Khronos Group Inc.
+#
 # SPDX-License-Identifier: Apache-2.0
diff --git a/scripts/test_check_spec_links.py b/scripts/test_check_spec_links.py
index 352b1d56..a4aeef2c 100644
--- a/scripts/test_check_spec_links.py
+++ b/scripts/test_check_spec_links.py
@@ -160,9 +160,12 @@ def test_misused_text(ckr):
 def test_extension(ckr):
     ckr.enabled(set(MessageId))
     # Check formatting of extension names:
-    # the following is the canonical way to refer to an extension
-    # (link wrapped in backticks)
-    expected_replacement = '`<<%s>>`' % EXT
+    # The canonical way to refer to an extension differs depending on the
+    # conventions object, which makes the expected replacement difficult
+    # without *access* to the conventions object.
+ # For XR: '`<<%s>>`' + # For Vulkan: 'apiext:%s' + expected_replacement = 'apiext:%s' % EXT # Extension name mentioned without any markup, should be added assert(loneMsgReplacement(ckr.check('asdf %s asdf' % EXT)) diff --git a/scripts/validitygenerator.py b/scripts/validitygenerator.py index 52bb3a3a..ffe81b3b 100755 --- a/scripts/validitygenerator.py +++ b/scripts/validitygenerator.py @@ -9,6 +9,10 @@ from collections import OrderedDict, namedtuple from functools import reduce from pathlib import Path +from generator import OutputGenerator, write +from spec_tools.attributes import (ExternSyncEntry, LengthEntry, + has_any_optional_in_param, + parse_optional_from_param) from spec_tools.conventions import ProseListFormats as plf from generator import OutputGenerator, write from spec_tools.attributes import ExternSyncEntry, LengthEntry @@ -17,7 +21,7 @@ from spec_tools.util import (findNamedElem, findNamedObject, findTypedElem, from spec_tools.validity import ValidityCollection, ValidityEntry -class UnhandledCaseError(RuntimeError): +class UnhandledCaseError(NotImplementedError): def __init__(self, msg=None): if msg: super().__init__('Got a case in the validity generator that we have not explicitly handled: ' + msg) @@ -58,6 +62,11 @@ def _genericIsDisjoint(a, b): return True +_WCHAR = "wchar_t" +_CHAR = "char" +_CHARACTER_TYPES = {_CHAR, _WCHAR} + + class ValidityOutputGenerator(OutputGenerator): """ValidityOutputGenerator - subclass of OutputGenerator. @@ -342,7 +351,7 @@ class ValidityOutputGenerator(OutputGenerator): def isHandleOptional(self, param, params): # Simple, if it is optional, return true - if param.get('optional') is not None: + if has_any_optional_in_param(param): return True # If no validity is being generated, it usually means that validity is complex and not absolute, so say yes. 
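The `makeExtensionLink` and test changes above replace a hard-coded `` `<<NAME>>` `` link with a call into a per-API conventions object, since Vulkan now marks extensions with the `apiext:` macro while OpenXR keeps the backticked cross-reference form. A sketch of that dispatch, assuming only the two output formats shown in the diff (the class names here are illustrative, not the actual `spec_tools` conventions API):

```python
class Conventions:
    """Base behaviour: OpenXR-style backticked cross-reference."""

    def formatExtension(self, name: str) -> str:
        return '`<<{}>>`'.format(name)


class VulkanConventions(Conventions):
    """Vulkan overrides extension markup to use the apiext: macro."""

    def formatExtension(self, name: str) -> str:
        return 'apiext:{}'.format(name)


# The checker asks its conventions object rather than assuming a format:
print(Conventions().formatExtension('XR_KHR_example'))
print(VulkanConventions().formatExtension('VK_EXT_legacy_vertex_attributes'))
```

This is why the test now expects `'apiext:%s' % EXT` when run against the Vulkan conventions rather than the previously canonical backticked form.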
@@ -360,7 +369,7 @@ class ValidityOutputGenerator(OutputGenerator): if other_param is None: self.logMsg('warn', length.other_param_name, 'is listed as a length for parameter', param, 'but no such parameter exists') - if other_param and other_param.get('optional'): + if other_param and has_any_optional_in_param(other_param): return True return False @@ -394,7 +403,9 @@ class ValidityOutputGenerator(OutputGenerator): # General pre-amble. Check optionality and add stuff. entry = ValidityEntry(anchor=(param_name, 'parameter')) - is_optional = param.get('optional') is not None and param.get('optional').split(',')[0] == 'true' + + optional = parse_optional_from_param(param) + is_optional = optional[0] # This is for a union member, and the valid member is chosen by an enum selection if selector: @@ -412,7 +423,7 @@ class ValidityOutputGenerator(OutputGenerator): return entry if self.paramIsStaticArray(param): - if paramtype != 'char': + if paramtype not in _CHARACTER_TYPES: entry += 'Each element of ' return entry @@ -426,8 +437,7 @@ class ValidityOutputGenerator(OutputGenerator): continue other_param = findNamedElem(params, length.other_param_name) - other_param_optional = (other_param is not None) and ( - other_param.get('optional') is not None) + other_param_optional = has_any_optional_in_param(other_param) if other_param is None or not other_param_optional: # Do not care about not-found params or non-optional params @@ -457,7 +467,7 @@ class ValidityOutputGenerator(OutputGenerator): entry += ' is not `NULL`, ' return entry - if param.get('optional'): + if any(optional): entry += self.makeOptionalPre(param) return entry @@ -491,14 +501,20 @@ class ValidityOutputGenerator(OutputGenerator): else: entry += '{} must: be '.format(self.makeParameterName(param_name)) - if self.paramIsStaticArray(param) and paramtype == 'char': + optional = parse_optional_from_param(param) + if self.paramIsStaticArray(param) and paramtype in _CHARACTER_TYPES: # TODO this is a minor hack to 
determine if this is a command parameter or a struct member if self.paramIsConst(param) or blockname.startswith(self.conventions.type_prefix): + if paramtype != _CHAR: + raise UnhandledCaseError("input arrays of wchar_t are not yet handled") entry += 'a null-terminated UTF-8 string whose length is less than or equal to ' entry += self.staticArrayLength(param) else: # This is a command's output parameter - entry += 'a character array of length %s ' % self.staticArrayLength(param) + entry += 'a ' + if paramtype == _WCHAR: + entry += "wide " + entry += 'character array of length %s ' % self.staticArrayLength(param) validity += entry return validity @@ -506,8 +522,10 @@ class ValidityOutputGenerator(OutputGenerator): # Arrays. These are hard to get right, apparently lengths = LengthEntry.parse_len_from_param(param) + if lengths is None: + raise RuntimeError("Logic error: decided this was an array but there is no len attribute") - for i, length in enumerate(LengthEntry.parse_len_from_param(param)): + for i, length in enumerate(lengths): if i == 0: # If the first index, make it singular. entry += 'a ' @@ -554,7 +572,7 @@ class ValidityOutputGenerator(OutputGenerator): # An array of void values is a byte array. 
entry += 'byte' - elif paramtype == 'char': + elif paramtype == _CHAR: # A null terminated array of chars is a string if lengths[-1].null_terminated: entry += 'UTF-8 string' @@ -571,10 +589,9 @@ class ValidityOutputGenerator(OutputGenerator): entry += 'valid ' # Check if the array elements are optional - array_element_optional = param.get('optional') is not None \ - and len(param.get('optional').split(',')) == len(LengthEntry.parse_len_from_param(param)) + 1 \ - and param.get('optional').split(',')[-1] == 'true' - if array_element_optional and self.getTypeCategory(paramtype) != 'bitmask': # bitmask is handled later + array_element_optional = len(optional) == len(lengths) + 1 \ + and optional[-1] + if array_element_optional and self.getTypeCategory(paramtype) != 'bitmask': # bitmask is handled later entry += 'or dlink:' + self.conventions.api_prefix + 'NULL_HANDLE ' entry += typetext @@ -668,7 +685,7 @@ class ValidityOutputGenerator(OutputGenerator): or is_pointer or not self.isStructAlwaysValid(paramtype)) typetext = None - if paramtype in ('void', 'char'): + if paramtype in ('void', _CHAR): # Chars and void are special cases - we call the impl function, # but do not use the typetext. # A null-terminated char array is a string, else it is chars. @@ -175,7 +175,7 @@ branch of the member gitlab server. 
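The `parse_optional_from_param` / `has_any_optional_in_param` refactor running through the hunks above replaces ad-hoc `param.get('optional').split(',')` string handling: the registry's `optional` attribute is a comma-separated list of booleans, one per level of indirection, so `optional[0]` answers "may the parameter itself be omitted/NULL" and a trailing entry (when the list is one longer than the `len` entries) answers "may individual array elements be NULL". A sketch of that parsing, assuming only the attribute format — the helper names mirror the `spec_tools.attributes` functions but this is not their implementation:

```python
def parse_optional(attr):
    """Parse a registry 'optional' attribute such as "true",
    "false,true", or None into a list of booleans."""
    if attr is None:
        return [False]
    return [v == 'true' for v in attr.split(',')]


def has_any_optional(attr):
    """True if any level of the parameter is optional."""
    return any(parse_optional(attr))


optional = parse_optional('true,false')
assert optional[0]                    # the pointer itself may be NULL
assert has_any_optional('false,true') # some level is optional
assert not has_any_optional(None)     # absent attribute: nothing optional
```

Centralizing this also makes the array-element check in the hunk above (`len(optional) == len(lengths) + 1 and optional[-1]`) a straightforward list comparison instead of repeated attribute re-parsing.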
#define <name>VKSC_API_VERSION_1_0</name> <type>VK_MAKE_API_VERSION</type>(VKSC_API_VARIANT, 1, 0, 0)// Patch version should always be set to 0</type> <type api="vulkan" category="define">// Version of this file -#define <name>VK_HEADER_VERSION</name> 283</type> +#define <name>VK_HEADER_VERSION</name> 284</type> <type api="vulkan" category="define" requires="VK_HEADER_VERSION">// Complete version of this file #define <name>VK_HEADER_VERSION_COMPLETE</name> <type>VK_MAKE_API_VERSION</type>(0, 1, 3, VK_HEADER_VERSION)</type> <type api="vulkansc" category="define">// Version of this file @@ -6258,6 +6258,16 @@ typedef void* <name>MTLSharedEvent_id</name>; <member optional="true" noautovalidity="true"><type>void</type>* <name>pNext</name></member> <member><type>VkBool32</type> <name>attachmentFeedbackLoopDynamicState</name></member> </type> + <type category="struct" name="VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT" structextends="VkPhysicalDeviceFeatures2,VkDeviceCreateInfo"> + <member values="VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_LEGACY_VERTEX_ATTRIBUTES_FEATURES_EXT"><type>VkStructureType</type> <name>sType</name></member> + <member optional="true" noautovalidity="true"><type>void</type>* <name>pNext</name></member> + <member><type>VkBool32</type> <name>legacyVertexAttributes</name></member> + </type> + <type category="struct" name="VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT" structextends="VkPhysicalDeviceProperties2"> + <member values="VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_LEGACY_VERTEX_ATTRIBUTES_PROPERTIES_EXT"><type>VkStructureType</type> <name>sType</name></member> + <member optional="true" noautovalidity="true"><type>void</type>* <name>pNext</name></member> + <member><type>VkBool32</type> <name>nativeUnalignedPerformance</name></member> + </type> <type category="struct" name="VkPhysicalDeviceMutableDescriptorTypeFeaturesEXT" structextends="VkPhysicalDeviceFeatures2,VkDeviceCreateInfo"> <member 
values="VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MUTABLE_DESCRIPTOR_TYPE_FEATURES_EXT"><type>VkStructureType</type> <name>sType</name></member> <member optional="true" noautovalidity="true"><type>void</type>* <name>pNext</name></member> @@ -19571,7 +19581,7 @@ typedef void* <name>MTLSharedEvent_id</name>; <type name="VkPhysicalDevice8BitStorageFeaturesKHR"/> </require> </extension> - <extension name="VK_EXT_external_memory_host" number="179" type="device" author="EXT" depends="VK_KHR_external_memory,VK_VERSION_1_1" contact="Daniel Rakos @drakos-amd" supported="vulkan,vulkansc"> + <extension name="VK_EXT_external_memory_host" number="179" type="device" author="EXT" depends="VK_KHR_external_memory,VK_VERSION_1_1" contact="Daniel Rakos @drakos-amd" supported="vulkan,vulkansc" ratified="vulkan"> <require> <enum value="1" name="VK_EXT_EXTERNAL_MEMORY_HOST_SPEC_VERSION"/> <enum value=""VK_EXT_external_memory_host"" name="VK_EXT_EXTERNAL_MEMORY_HOST_EXTENSION_NAME"/> @@ -19972,7 +19982,7 @@ typedef void* <name>MTLSharedEvent_id</name>; <type name="VkPhysicalDeviceVulkanMemoryModelFeaturesKHR"/> </require> </extension> - <extension name="VK_EXT_pci_bus_info" number="213" type="device" author="EXT" depends="VK_KHR_get_physical_device_properties2,VK_VERSION_1_1" contact="Matthaeus G. Chajdas @anteru" supported="vulkan,vulkansc"> + <extension name="VK_EXT_pci_bus_info" number="213" type="device" author="EXT" depends="VK_KHR_get_physical_device_properties2,VK_VERSION_1_1" contact="Matthaeus G. 
Chajdas @anteru" supported="vulkan,vulkansc" ratified="vulkan"> <require> <enum value="2" name="VK_EXT_PCI_BUS_INFO_SPEC_VERSION"/> <enum value=""VK_EXT_pci_bus_info"" name="VK_EXT_PCI_BUS_INFO_EXTENSION_NAME"/> @@ -20215,7 +20225,7 @@ typedef void* <name>MTLSharedEvent_id</name>; <enum value=""VK_KHR_spirv_1_4"" name="VK_KHR_SPIRV_1_4_EXTENSION_NAME"/> </require> </extension> - <extension name="VK_EXT_memory_budget" number="238" type="device" depends="VK_KHR_get_physical_device_properties2,VK_VERSION_1_1" author="EXT" contact="Jeff Bolz @jeffbolznv" supported="vulkan,vulkansc"> + <extension name="VK_EXT_memory_budget" number="238" type="device" depends="VK_KHR_get_physical_device_properties2,VK_VERSION_1_1" author="EXT" contact="Jeff Bolz @jeffbolznv" supported="vulkan,vulkansc" ratified="vulkan"> <require> <enum value="1" name="VK_EXT_MEMORY_BUDGET_SPEC_VERSION"/> <enum value=""VK_EXT_memory_budget"" name="VK_EXT_MEMORY_BUDGET_EXTENSION_NAME"/> @@ -20397,7 +20407,7 @@ typedef void* <name>MTLSharedEvent_id</name>; <type name="VkPhysicalDeviceFragmentShaderInterlockFeaturesEXT"/> </require> </extension> - <extension name="VK_EXT_ycbcr_image_arrays" number="253" type="device" depends="VK_KHR_sampler_ycbcr_conversion,VK_VERSION_1_1" author="EXT" contact="Piers Daniell @pdaniell-nv" supported="vulkan,vulkansc"> + <extension name="VK_EXT_ycbcr_image_arrays" number="253" type="device" depends="VK_KHR_sampler_ycbcr_conversion,VK_VERSION_1_1" author="EXT" contact="Piers Daniell @pdaniell-nv" supported="vulkan,vulkansc" ratified="vulkan"> <require> <enum value="1" name="VK_EXT_YCBCR_IMAGE_ARRAYS_SPEC_VERSION"/> <enum value=""VK_EXT_ycbcr_image_arrays"" name="VK_EXT_YCBCR_IMAGE_ARRAYS_EXTENSION_NAME"/> @@ -20686,7 +20696,7 @@ typedef void* <name>MTLSharedEvent_id</name>; <command name="vkUnmapMemory2KHR"/> </require> </extension> - <extension name="VK_EXT_map_memory_placed" number="273" type="device" depends="VK_KHR_map_memory2" author="EXT" contact="Faith Ekstrand 
@gfxstrand" supported="vulkan"> + <extension name="VK_EXT_map_memory_placed" number="273" type="device" depends="VK_KHR_map_memory2" author="EXT" contact="Faith Ekstrand @gfxstrand" supported="vulkan" ratified="vulkan"> <require> <enum value="1" name="VK_EXT_MAP_MEMORY_PLACED_SPEC_VERSION"/> <enum value=""VK_EXT_map_memory_placed"" name="VK_EXT_MAP_MEMORY_PLACED_EXTENSION_NAME"/> @@ -21884,7 +21894,7 @@ typedef void* <name>MTLSharedEvent_id</name>; <type name="VkPipelineViewportDepthClipControlCreateInfoEXT"/> </require> </extension> - <extension name="VK_EXT_primitive_topology_list_restart" number="357" type="device" author="EXT" contact="Shahbaz Youssefi @syoussefi" depends="VK_KHR_get_physical_device_properties2,VK_VERSION_1_1" supported="vulkan" specialuse="glemulation"> + <extension name="VK_EXT_primitive_topology_list_restart" number="357" type="device" author="EXT" contact="Shahbaz Youssefi @syoussefi" depends="VK_KHR_get_physical_device_properties2,VK_VERSION_1_1" supported="vulkan" ratified="vulkan" specialuse="glemulation"> <require> <enum value="1" name="VK_EXT_PRIMITIVE_TOPOLOGY_LIST_RESTART_SPEC_VERSION"/> <enum value=""VK_EXT_primitive_topology_list_restart"" name="VK_EXT_PRIMITIVE_TOPOLOGY_LIST_RESTART_EXTENSION_NAME"/> @@ -23676,10 +23686,14 @@ typedef void* <name>MTLSharedEvent_id</name>; <type name="VkMutableDescriptorTypeCreateInfoEXT"/> </require> </extension> - <extension name="VK_EXT_extension_496" number="496" author="EXT" contact="Mike Blumenkrantz @zmike" type="device" supported="disabled"> + <extension name="VK_EXT_legacy_vertex_attributes" number="496" author="EXT" contact="Mike Blumenkrantz @zmike" type="device" supported="vulkan" depends="VK_EXT_vertex_input_dynamic_state" specialuse="glemulation"> <require> - <enum value="0" name="VK_EXT_EXTENSION_496_SPEC_VERSION"/> - <enum value=""VK_EXT_extension_496"" name="VK_EXT_EXTENSION_496_EXTENSION_NAME"/> + <enum value="1" name="VK_EXT_LEGACY_VERTEX_ATTRIBUTES_SPEC_VERSION"/> + <enum 
value=""VK_EXT_legacy_vertex_attributes"" name="VK_EXT_LEGACY_VERTEX_ATTRIBUTES_EXTENSION_NAME"/> + <enum offset="0" extends="VkStructureType" name="VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_LEGACY_VERTEX_ATTRIBUTES_FEATURES_EXT"/> + <enum offset="1" extends="VkStructureType" name="VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_LEGACY_VERTEX_ATTRIBUTES_PROPERTIES_EXT"/> + <type name="VkPhysicalDeviceLegacyVertexAttributesFeaturesEXT"/> + <type name="VkPhysicalDeviceLegacyVertexAttributesPropertiesEXT"/> </require> </extension> <extension name="VK_EXT_layer_settings" number="497" author="EXT" contact="Christophe Riccio @christophe" type="instance" supported="vulkan" ratified="vulkan"> @@ -24370,6 +24384,44 @@ typedef void* <name>MTLSharedEvent_id</name>; <enum bitpos="35" extends="VkPipelineCreateFlagBits2KHR" name="VK_PIPELINE_CREATE_2_RESERVED_35_BIT_KHR"/> </require> </extension> + <extension name="VK_EXT_extension_578" number="578" author="EXT" contact="Daniel Story @daniel-story" supported="disabled"> + <require> + <enum value="0" name="VK_EXT_EXTENSION_578_SPEC_VERSION"/> + <enum value=""VK_EXT_extension_578"" name="VK_EXT_EXTENSION_578_EXTENSION_NAME"/> + </require> + </extension> + <extension name="VK_EXT_extension_579" number="579" author="EXT" contact="Daniel Story @daniel-story" supported="disabled"> + <require> + <enum value="0" name="VK_EXT_EXTENSION_579_SPEC_VERSION"/> + <enum value=""VK_EXT_extension_579"" name="VK_EXT_EXTENSION_579_EXTENSION_NAME"/> + <enum bitpos="8" extends="VkShaderCreateFlagBitsEXT" name="VK_SHADER_CREATE_RESERVED_8_BIT_EXT"/> + <enum bitpos="9" extends="VkShaderCreateFlagBitsEXT" name="VK_SHADER_CREATE_RESERVED_9_BIT_EXT"/> + </require> + </extension> + <extension name="VK_EXT_extension_580" number="580" author="EXT" contact="Graeme Leese @gnl21" supported="disabled"> + <require> + <enum value="0" name="VK_EXT_EXTENSION_580_SPEC_VERSION"/> + <enum value=""VK_EXT_extension_580"" name="VK_EXT_EXTENSION_580_EXTENSION_NAME"/> + </require> + </extension> 
+ <extension name="VK_NV_extension_581" number="581" author="NV" contact="Piers Daniell @pdaniell-nv" supported="disabled"> + <require> + <enum value="0" name="VK_NV_EXTENSION_581_SPEC_VERSION"/> + <enum value=""VK_NV_extension_581"" name="VK_NV_EXTENSION_581_EXTENSION_NAME"/> + </require> + </extension> + <extension name="VK_EXT_extension_582" number="582" author="EXT" contact="Eric Werness @ewerness-nv" supported="disabled"> + <require> + <enum value="0" name="VK_EXT_EXTENSION_582_SPEC_VERSION"/> + <enum value=""VK_EXT_extension_582"" name="VK_EXT_EXTENSION_582_EXTENSION_NAME"/> + </require> + </extension> + <extension name="VK_EXT_extension_583" number="583" author="EXT" contact="Jules Blok @jules" supported="disabled"> + <require> + <enum value="0" name="VK_EXT_EXTENSION_583_SPEC_VERSION"/> + <enum value=""VK_EXT_extension_583"" name="VK_EXT_EXTENSION_583_EXTENSION_NAME"/> + </require> + </extension> </extensions> <formats> <format name="VK_FORMAT_R4G4_UNORM_PACK8" class="8-bit" blockSize="1" texelsPerBlock="1" packed="8">