From patchwork Thu Oct 26 10:48:41 2023
X-Patchwork-Submitter: Marta Rybczynska
X-Patchwork-Id: 32949
From: Marta Rybczynska
To: openembedded-core@lists.openembedded.org
Cc: richard.purdie@linuxfoundation.org, Louis Rannou, Marta Rybczynska
Subject: [RFC][OE-core 1/7] create-spdx-3.0: copy 2.2 class
Date: Thu, 26 Oct 2023 12:48:41 +0200
Message-ID: <20231026105033.257971-2-marta.rybczynska@syslinbit.com>
In-Reply-To: <20231026105033.257971-1-marta.rybczynska@syslinbit.com>
References: <20231026105033.257971-1-marta.rybczynska@syslinbit.com>
X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189712

From: Louis Rannou

Initialize the work on SPDX 3 with a copy of the SPDX 2.2 class.
Change the default to SPDX 3.
Signed-off-by: Louis Rannou
Signed-off-by: Marta Rybczynska
---
 meta/classes/create-spdx-3.0.bbclass | 1158 ++++++++++++++++++++++++++
 meta/classes/create-spdx.bbclass     |    2 +-
 2 files changed, 1159 insertions(+), 1 deletion(-)
 create mode 100644 meta/classes/create-spdx-3.0.bbclass

diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass
new file mode 100644
index 0000000000..b0aef80db1
--- /dev/null
+++ b/meta/classes/create-spdx-3.0.bbclass
@@ -0,0 +1,1158 @@
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+
+DEPLOY_DIR_SPDX ??= "${DEPLOY_DIR}/spdx"
+
+# The product name that the CVE database uses. Defaults to BPN, but may need to
+# be overridden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
+CVE_PRODUCT ??= "${BPN}"
+CVE_VERSION ??= "${PV}"
+
+SPDXDIR ??= "${WORKDIR}/spdx"
+SPDXDEPLOY = "${SPDXDIR}/deploy"
+SPDXWORK = "${SPDXDIR}/work"
+SPDXIMAGEWORK = "${SPDXDIR}/image-work"
+SPDXSDKWORK = "${SPDXDIR}/sdk-work"
+SPDXDEPS = "${SPDXDIR}/deps.json"
+
+SPDX_TOOL_NAME ??= "oe-spdx-creator"
+SPDX_TOOL_VERSION ??= "1.0"
+
+SPDXRUNTIMEDEPLOY = "${SPDXDIR}/runtime-deploy"
+
+SPDX_INCLUDE_SOURCES ??= "0"
+SPDX_ARCHIVE_SOURCES ??= "0"
+SPDX_ARCHIVE_PACKAGED ??= "0"
+
+SPDX_UUID_NAMESPACE ??= "sbom.openembedded.org"
+SPDX_NAMESPACE_PREFIX ??= "http://spdx.org/spdxdoc"
+SPDX_PRETTY ??= "0"
+
+SPDX_LICENSES ??= "${COREBASE}/meta/files/spdx-licenses.json"
+
+SPDX_CUSTOM_ANNOTATION_VARS ??= ""
+
+SPDX_ORG ??= "OpenEmbedded ()"
+SPDX_SUPPLIER ??= "Organization: ${SPDX_ORG}"
+SPDX_SUPPLIER[doc] = "The SPDX PackageSupplier field for SPDX packages created from \
+    this recipe. For SPDX documents created using this class during the build, this \
+    is the contact information for the person or organization who is doing the \
+    build."
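As a standalone illustration (not part of the patch): the class scans source files for SPDX-License-Identifier tags. The sketch below reuses the same regular expression and 15000-byte read limit as the `extract_licenses()` helper in this class; the file I/O and `bb.warn` error path are omitted, and `scan_license_ids` is an illustrative name.

```python
# Sketch of the SPDX-License-Identifier scan performed by extract_licenses();
# operates on an in-memory buffer instead of opening a file.
import re

LIC_REGEX = re.compile(
    rb'^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$',
    re.MULTILINE)

def scan_license_ids(data: bytes):
    """Return decoded license expressions found in the first 15000 bytes."""
    return [m.decode('ascii') for m in LIC_REGEX.findall(data[:15000])]

sample = b"""\
// SPDX-License-Identifier: GPL-2.0-only
int main(void) { return 0; }
"""
print(scan_license_ids(sample))  # ['GPL-2.0-only']
```

The leading `\W*` lets the tag match regardless of the comment syntax (`//`, `#`, `/*`, and so on) that precedes it.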
+
+def extract_licenses(filename):
+    import re
+
+    lic_regex = re.compile(rb'^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$', re.MULTILINE)
+
+    try:
+        with open(filename, 'rb') as f:
+            size = min(15000, os.stat(filename).st_size)
+            txt = f.read(size)
+            licenses = re.findall(lic_regex, txt)
+            if licenses:
+                ascii_licenses = [lic.decode('ascii') for lic in licenses]
+                return ascii_licenses
+    except Exception as e:
+        bb.warn(f"Exception reading {filename}: {e}")
+    return None
+
+def get_doc_namespace(d, doc):
+    import uuid
+    namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, d.getVar("SPDX_UUID_NAMESPACE"))
+    return "%s/%s-%s" % (d.getVar("SPDX_NAMESPACE_PREFIX"), doc.name, str(uuid.uuid5(namespace_uuid, doc.name)))
+
+def create_annotation(d, comment):
+    from datetime import datetime, timezone
+
+    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+    annotation = oe.spdx.SPDXAnnotation()
+    annotation.annotationDate = creation_time
+    annotation.annotationType = "OTHER"
+    annotation.annotator = "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION"))
+    annotation.comment = comment
+    return annotation
+
+def recipe_spdx_is_native(d, recipe):
+    return any(a.annotationType == "OTHER" and
+      a.annotator == "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) and
+      a.comment == "isNative" for a in recipe.annotations)
+
+def is_work_shared_spdx(d):
+    return bb.data.inherits_class('kernel', d) or ('work-shared' in d.getVar('WORKDIR'))
+
+def get_json_indent(d):
+    if d.getVar("SPDX_PRETTY") == "1":
+        return 2
+    return None
+
+python() {
+    import json
+    if d.getVar("SPDX_LICENSE_DATA"):
+        return
+
+    with open(d.getVar("SPDX_LICENSES"), "r") as f:
+        data = json.load(f)
+        # Transform the license array to a dictionary
+        data["licenses"] = {l["licenseId"]: l for l in data["licenses"]}
+        d.setVar("SPDX_LICENSE_DATA", data)
+}
+
+def convert_license_to_spdx(lic, document, d, existing={}):
+    from pathlib
import Path
+    import oe.spdx
+
+    license_data = d.getVar("SPDX_LICENSE_DATA")
+    extracted = {}
+
+    def add_extracted_license(ident, name):
+        nonlocal document
+
+        if name in extracted:
+            return
+
+        extracted_info = oe.spdx.SPDXExtractedLicensingInfo()
+        extracted_info.name = name
+        extracted_info.licenseId = ident
+        extracted_info.extractedText = None
+
+        if name == "PD":
+            # Special-case this.
+            extracted_info.extractedText = "Software released to the public domain"
+        else:
+            # Search for the license in COMMON_LICENSE_DIR and LICENSE_PATH
+            for directory in [d.getVar('COMMON_LICENSE_DIR')] + (d.getVar('LICENSE_PATH') or '').split():
+                try:
+                    with (Path(directory) / name).open(errors="replace") as f:
+                        extracted_info.extractedText = f.read()
+                        break
+                except FileNotFoundError:
+                    pass
+            if extracted_info.extractedText is None:
+                # If it's not SPDX or PD, then NO_GENERIC_LICENSE must be set
+                filename = d.getVarFlag('NO_GENERIC_LICENSE', name)
+                if filename:
+                    filename = d.expand("${S}/" + filename)
+                    with open(filename, errors="replace") as f:
+                        extracted_info.extractedText = f.read()
+                else:
+                    bb.fatal("Cannot find any text for license %s" % name)
+
+        extracted[name] = extracted_info
+        document.hasExtractedLicensingInfos.append(extracted_info)
+
+    def convert(l):
+        if l == "(" or l == ")":
+            return l
+
+        if l == "&":
+            return "AND"
+
+        if l == "|":
+            return "OR"
+
+        if l == "CLOSED":
+            return "NONE"
+
+        spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l
+        if spdx_license in license_data["licenses"]:
+            return spdx_license
+
+        try:
+            spdx_license = existing[l]
+        except KeyError:
+            spdx_license = "LicenseRef-" + l
+            add_extracted_license(spdx_license, l)
+
+        return spdx_license
+
+    lic_split = lic.replace("(", " ( ").replace(")", " ) ").replace("|", " | ").replace("&", " & ").split()
+
+    return ' '.join(convert(l) for l in lic_split)
+
+def process_sources(d):
+    pn = d.getVar('PN')
+    assume_provided = (d.getVar("ASSUME_PROVIDED") or "").split()
+    if pn in
assume_provided:
+        for p in d.getVar("PROVIDES").split():
+            if p != pn:
+                pn = p
+                break
+
+    # glibc-locale: do_fetch, do_unpack and do_patch tasks have been deleted,
+    # so avoid archiving source here.
+    if pn.startswith('glibc-locale'):
+        return False
+    if d.getVar('PN') == "libtool-cross":
+        return False
+    if d.getVar('PN') == "libgcc-initial":
+        return False
+    if d.getVar('PN') == "shadow-sysroot":
+        return False
+
+    # We just archive gcc-source for all the gcc related recipes
+    if d.getVar('BPN') in ['gcc', 'libgcc']:
+        bb.debug(1, 'spdx: There is a bug in the scan of %s, do nothing' % pn)
+        return False
+
+    return True
+
+
+def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]):
+    from pathlib import Path
+    import oe.spdx
+    import hashlib
+
+    source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
+    if source_date_epoch:
+        source_date_epoch = int(source_date_epoch)
+
+    sha1s = []
+    spdx_files = []
+
+    file_counter = 1
+    for subdir, dirs, files in os.walk(topdir):
+        dirs[:] = [d for d in dirs if d not in ignore_dirs]
+        if subdir == str(topdir):
+            dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs]
+
+        for file in files:
+            filepath = Path(subdir) / file
+            filename = str(filepath.relative_to(topdir))
+
+            if not filepath.is_symlink() and filepath.is_file():
+                spdx_file = oe.spdx.SPDXFile()
+                spdx_file.SPDXID = get_spdxid(file_counter)
+                for t in get_types(filepath):
+                    spdx_file.fileTypes.append(t)
+                spdx_file.fileName = filename
+
+                if archive is not None:
+                    with filepath.open("rb") as f:
+                        info = archive.gettarinfo(fileobj=f)
+                        info.name = filename
+                        info.uid = 0
+                        info.gid = 0
+                        info.uname = "root"
+                        info.gname = "root"
+
+                        if source_date_epoch is not None and info.mtime > source_date_epoch:
+                            info.mtime = source_date_epoch
+
+                        archive.addfile(info, f)
+
+                sha1 = bb.utils.sha1_file(filepath)
+                sha1s.append(sha1)
+                spdx_file.checksums.append(oe.spdx.SPDXChecksum(
+                    algorithm="SHA1",
checksumValue=sha1, + )) + spdx_file.checksums.append(oe.spdx.SPDXChecksum( + algorithm="SHA256", + checksumValue=bb.utils.sha256_file(filepath), + )) + + if "SOURCE" in spdx_file.fileTypes: + extracted_lics = extract_licenses(filepath) + if extracted_lics: + spdx_file.licenseInfoInFiles = extracted_lics + + doc.files.append(spdx_file) + doc.add_relationship(spdx_pkg, "CONTAINS", spdx_file) + spdx_pkg.hasFiles.append(spdx_file.SPDXID) + + spdx_files.append(spdx_file) + + file_counter += 1 + + sha1s.sort() + verifier = hashlib.sha1() + for v in sha1s: + verifier.update(v.encode("utf-8")) + spdx_pkg.packageVerificationCode.packageVerificationCodeValue = verifier.hexdigest() + + return spdx_files + + +def add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources): + from pathlib import Path + import hashlib + import oe.packagedata + import oe.spdx + + debug_search_paths = [ + Path(d.getVar('PKGD')), + Path(d.getVar('STAGING_DIR_TARGET')), + Path(d.getVar('STAGING_DIR_NATIVE')), + Path(d.getVar('STAGING_KERNEL_DIR')), + ] + + pkg_data = oe.packagedata.read_subpkgdata_extended(package, d) + + if pkg_data is None: + return + + for file_path, file_data in pkg_data["files_info"].items(): + if not "debugsrc" in file_data: + continue + + for pkg_file in package_files: + if file_path.lstrip("/") == pkg_file.fileName.lstrip("/"): + break + else: + bb.fatal("No package file found for %s in %s; SPDX found: %s" % (str(file_path), package, + " ".join(p.fileName for p in package_files))) + continue + + for debugsrc in file_data["debugsrc"]: + ref_id = "NOASSERTION" + for search in debug_search_paths: + if debugsrc.startswith("/usr/src/kernel"): + debugsrc_path = search / debugsrc.replace('/usr/src/kernel/', '') + else: + debugsrc_path = search / debugsrc.lstrip("/") + if not debugsrc_path.exists(): + continue + + file_sha256 = bb.utils.sha256_file(debugsrc_path) + + if file_sha256 in sources: + source_file = sources[file_sha256] + + doc_ref = 
package_doc.find_external_document_ref(source_file.doc.documentNamespace) + if doc_ref is None: + doc_ref = oe.spdx.SPDXExternalDocumentRef() + doc_ref.externalDocumentId = "DocumentRef-dependency-" + source_file.doc.name + doc_ref.spdxDocument = source_file.doc.documentNamespace + doc_ref.checksum.algorithm = "SHA1" + doc_ref.checksum.checksumValue = source_file.doc_sha1 + package_doc.externalDocumentRefs.append(doc_ref) + + ref_id = "%s:%s" % (doc_ref.externalDocumentId, source_file.file.SPDXID) + else: + bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256)) + break + else: + bb.debug(1, "Debug source %s not found" % debugsrc) + + package_doc.add_relationship(pkg_file, "GENERATED_FROM", ref_id, comment=debugsrc) + +add_package_sources_from_debug[vardepsexclude] += "STAGING_KERNEL_DIR" + +def collect_dep_recipes(d, doc, spdx_recipe): + import json + from pathlib import Path + import oe.sbom + import oe.spdx + + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + spdx_deps_file = Path(d.getVar("SPDXDEPS")) + package_archs = d.getVar("SSTATE_ARCHS").split() + package_archs.reverse() + + dep_recipes = [] + + with spdx_deps_file.open("r") as f: + deps = json.load(f) + + for dep_pn, dep_hashfn in deps: + dep_recipe_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, "recipe-" + dep_pn, dep_hashfn) + if not dep_recipe_path: + bb.fatal("Cannot find any SPDX file for recipe %s, %s" % (dep_pn, dep_hashfn)) + + spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_recipe_path) + + for pkg in spdx_dep_doc.packages: + if pkg.name == dep_pn: + spdx_dep_recipe = pkg + break + else: + continue + + dep_recipes.append(oe.sbom.DepRecipe(spdx_dep_doc, spdx_dep_sha1, spdx_dep_recipe)) + + dep_recipe_ref = oe.spdx.SPDXExternalDocumentRef() + dep_recipe_ref.externalDocumentId = "DocumentRef-dependency-" + spdx_dep_doc.name + dep_recipe_ref.spdxDocument = spdx_dep_doc.documentNamespace + 
dep_recipe_ref.checksum.algorithm = "SHA1" + dep_recipe_ref.checksum.checksumValue = spdx_dep_sha1 + + doc.externalDocumentRefs.append(dep_recipe_ref) + + doc.add_relationship( + "%s:%s" % (dep_recipe_ref.externalDocumentId, spdx_dep_recipe.SPDXID), + "BUILD_DEPENDENCY_OF", + spdx_recipe + ) + + return dep_recipes + +collect_dep_recipes[vardepsexclude] = "SSTATE_ARCHS" + +def collect_dep_sources(d, dep_recipes): + import oe.sbom + + sources = {} + for dep in dep_recipes: + # Don't collect sources from native recipes as they + # match non-native sources also. + if recipe_spdx_is_native(d, dep.recipe): + continue + recipe_files = set(dep.recipe.hasFiles) + + for spdx_file in dep.doc.files: + if spdx_file.SPDXID not in recipe_files: + continue + + if "SOURCE" in spdx_file.fileTypes: + for checksum in spdx_file.checksums: + if checksum.algorithm == "SHA256": + sources[checksum.checksumValue] = oe.sbom.DepSource(dep.doc, dep.doc_sha1, dep.recipe, spdx_file) + break + + return sources + +def add_download_packages(d, doc, recipe): + import os.path + from bb.fetch2 import decodeurl, CHECKSUM_LIST + import bb.process + import oe.spdx + import oe.sbom + + for download_idx, src_uri in enumerate(d.getVar('SRC_URI').split()): + f = bb.fetch2.FetchData(src_uri, d) + + for name in f.names: + package = oe.spdx.SPDXPackage() + package.name = "%s-source-%d" % (d.getVar("PN"), download_idx + 1) + package.SPDXID = oe.sbom.get_download_spdxid(d, download_idx + 1) + + if f.type == "file": + continue + + uri = f.type + proto = getattr(f, "proto", None) + if proto is not None: + uri = uri + "+" + proto + uri = uri + "://" + f.host + f.path + + if f.method.supports_srcrev(): + uri = uri + "@" + f.revisions[name] + + if f.method.supports_checksum(f): + for checksum_id in CHECKSUM_LIST: + if checksum_id.upper() not in oe.spdx.SPDXPackage.ALLOWED_CHECKSUMS: + continue + + expected_checksum = getattr(f, "%s_expected" % checksum_id) + if expected_checksum is None: + continue + + c = 
oe.spdx.SPDXChecksum() + c.algorithm = checksum_id.upper() + c.checksumValue = expected_checksum + package.checksums.append(c) + + package.downloadLocation = uri + doc.packages.append(package) + doc.add_relationship(doc, "DESCRIBES", package) + # In the future, we might be able to do more fancy dependencies, + # but this should be sufficient for now + doc.add_relationship(package, "BUILD_DEPENDENCY_OF", recipe) + +def collect_direct_deps(d, dep_task): + current_task = "do_" + d.getVar("BB_CURRENTTASK") + pn = d.getVar("PN") + + taskdepdata = d.getVar("BB_TASKDEPDATA", False) + + for this_dep in taskdepdata.values(): + if this_dep[0] == pn and this_dep[1] == current_task: + break + else: + bb.fatal(f"Unable to find this {pn}:{current_task} in taskdepdata") + + deps = set() + for dep_name in this_dep[3]: + dep_data = taskdepdata[dep_name] + if dep_data[1] == dep_task and dep_data[0] != pn: + deps.add((dep_data[0], dep_data[7])) + + return sorted(deps) + +collect_direct_deps[vardepsexclude] += "BB_TASKDEPDATA" +collect_direct_deps[vardeps] += "DEPENDS" + +python do_collect_spdx_deps() { + # This task calculates the build time dependencies of the recipe, and is + # required because while a task can deptask on itself, those dependencies + # do not show up in BB_TASKDEPDATA. 
To work around that, this task does the
+    # deptask on do_create_spdx and writes out the dependencies it finds, then
+    # do_create_spdx reads in the found dependencies when writing the actual
+    # SPDX document
+    import json
+    from pathlib import Path
+
+    spdx_deps_file = Path(d.getVar("SPDXDEPS"))
+
+    deps = collect_direct_deps(d, "do_create_spdx")
+
+    with spdx_deps_file.open("w") as f:
+        json.dump(deps, f)
+}
+# NOTE: depending on do_unpack is a hack that is necessary to get its dependencies for archiving the source
+addtask do_collect_spdx_deps after do_unpack
+do_collect_spdx_deps[depends] += "${PATCHDEPENDENCY}"
+do_collect_spdx_deps[deptask] = "do_create_spdx"
+do_collect_spdx_deps[dirs] = "${SPDXDIR}"
+
+python do_create_spdx() {
+    from datetime import datetime, timezone
+    import oe.sbom
+    import oe.spdx
+    import uuid
+    from pathlib import Path
+    from contextlib import contextmanager
+    import oe.cve_check
+
+    @contextmanager
+    def optional_tarfile(name, guard, mode="w"):
+        import tarfile
+        import bb.compress.zstd
+
+        num_threads = int(d.getVar("BB_NUMBER_THREADS"))
+
+        if guard:
+            name.parent.mkdir(parents=True, exist_ok=True)
+            with bb.compress.zstd.open(name, mode=mode + "b", num_threads=num_threads) as f:
+                with tarfile.open(fileobj=f, mode=mode + "|") as tf:
+                    yield tf
+        else:
+            yield None
+
+
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+    spdx_workdir = Path(d.getVar("SPDXWORK"))
+    include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1"
+    archive_sources = d.getVar("SPDX_ARCHIVE_SOURCES") == "1"
+    archive_packaged = d.getVar("SPDX_ARCHIVE_PACKAGED") == "1"
+    pkg_arch = d.getVar("SSTATE_PKGARCH")
+
+    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+
+    doc = oe.spdx.SPDXDocument()
+
+    doc.name = "recipe-" + d.getVar("PN")
+    doc.documentNamespace = get_doc_namespace(d, doc)
+    doc.creationInfo.created = creation_time
+    doc.creationInfo.comment = "This document was created by analyzing recipe files during the build."
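As a standalone illustration (not part of the patch): the document namespace set here combines SPDX_NAMESPACE_PREFIX with a UUIDv5 derived from SPDX_UUID_NAMESPACE, so the same document name always yields the same namespace. The sketch below mirrors `get_doc_namespace()` with the class defaults hard-coded; `doc_namespace` and the document name `recipe-busybox` are illustrative.

```python
# Sketch of get_doc_namespace(): a deterministic namespace URI built from a
# prefix, a DNS-anchored UUIDv5 namespace, and the document name.
import uuid

def doc_namespace(prefix, uuid_namespace, doc_name):
    # UUIDv5 is deterministic: same namespace + name -> same UUID.
    namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, uuid_namespace)
    return "%s/%s-%s" % (prefix, doc_name, uuid.uuid5(namespace_uuid, doc_name))

ns = doc_namespace("http://spdx.org/spdxdoc", "sbom.openembedded.org", "recipe-busybox")
print(ns)  # http://spdx.org/spdxdoc/recipe-busybox-<uuid5>
```

Determinism matters here because external document references in other SPDX files must resolve to the same namespace across rebuilds.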
+ doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] + doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") + doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) + doc.creationInfo.creators.append("Person: N/A ()") + + recipe = oe.spdx.SPDXPackage() + recipe.name = d.getVar("PN") + recipe.versionInfo = d.getVar("PV") + recipe.SPDXID = oe.sbom.get_recipe_spdxid(d) + recipe.supplier = d.getVar("SPDX_SUPPLIER") + if bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d): + recipe.annotations.append(create_annotation(d, "isNative")) + + homepage = d.getVar("HOMEPAGE") + if homepage: + recipe.homepage = homepage + + license = d.getVar("LICENSE") + if license: + recipe.licenseDeclared = convert_license_to_spdx(license, doc, d) + + summary = d.getVar("SUMMARY") + if summary: + recipe.summary = summary + + description = d.getVar("DESCRIPTION") + if description: + recipe.description = description + + if d.getVar("SPDX_CUSTOM_ANNOTATION_VARS"): + for var in d.getVar('SPDX_CUSTOM_ANNOTATION_VARS').split(): + recipe.annotations.append(create_annotation(d, var + "=" + d.getVar(var))) + + # Some CVEs may be patched during the build process without incrementing the version number, + # so querying for CVEs based on the CPE id can lead to false positives. To account for this, + # save the CVEs fixed by patches to source information field in the SPDX. 
+ patched_cves = oe.cve_check.get_patched_cves(d) + patched_cves = list(patched_cves) + patched_cves = ' '.join(patched_cves) + if patched_cves: + recipe.sourceInfo = "CVEs fixed: " + patched_cves + + cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) + if cpe_ids: + for cpe_id in cpe_ids: + cpe = oe.spdx.SPDXExternalReference() + cpe.referenceCategory = "SECURITY" + cpe.referenceType = "http://spdx.org/rdf/references/cpe23Type" + cpe.referenceLocator = cpe_id + recipe.externalRefs.append(cpe) + + doc.packages.append(recipe) + doc.add_relationship(doc, "DESCRIBES", recipe) + + add_download_packages(d, doc, recipe) + + if process_sources(d) and include_sources: + recipe_archive = deploy_dir_spdx / "recipes" / (doc.name + ".tar.zst") + with optional_tarfile(recipe_archive, archive_sources) as archive: + spdx_get_src(d) + + add_package_files( + d, + doc, + recipe, + spdx_workdir, + lambda file_counter: "SPDXRef-SourceFile-%s-%d" % (d.getVar("PN"), file_counter), + lambda filepath: ["SOURCE"], + ignore_dirs=[".git"], + ignore_top_level_dirs=["temp"], + archive=archive, + ) + + if archive is not None: + recipe.packageFileName = str(recipe_archive.name) + + dep_recipes = collect_dep_recipes(d, doc, recipe) + + doc_sha1 = oe.sbom.write_doc(d, doc, pkg_arch, "recipes", indent=get_json_indent(d)) + dep_recipes.append(oe.sbom.DepRecipe(doc, doc_sha1, recipe)) + + recipe_ref = oe.spdx.SPDXExternalDocumentRef() + recipe_ref.externalDocumentId = "DocumentRef-recipe-" + recipe.name + recipe_ref.spdxDocument = doc.documentNamespace + recipe_ref.checksum.algorithm = "SHA1" + recipe_ref.checksum.checksumValue = doc_sha1 + + sources = collect_dep_sources(d, dep_recipes) + found_licenses = {license.name:recipe_ref.externalDocumentId + ":" + license.licenseId for license in doc.hasExtractedLicensingInfos} + + if not recipe_spdx_is_native(d, recipe): + bb.build.exec_func("read_subpackage_metadata", d) + + pkgdest = Path(d.getVar("PKGDEST")) + for 
package in d.getVar("PACKAGES").split(): + if not oe.packagedata.packaged(package, d): + continue + + package_doc = oe.spdx.SPDXDocument() + pkg_name = d.getVar("PKG:%s" % package) or package + package_doc.name = pkg_name + package_doc.documentNamespace = get_doc_namespace(d, package_doc) + package_doc.creationInfo.created = creation_time + package_doc.creationInfo.comment = "This document was created by analyzing packages created during the build." + package_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] + package_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") + package_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) + package_doc.creationInfo.creators.append("Person: N/A ()") + package_doc.externalDocumentRefs.append(recipe_ref) + + package_license = d.getVar("LICENSE:%s" % package) or d.getVar("LICENSE") + + spdx_package = oe.spdx.SPDXPackage() + + spdx_package.SPDXID = oe.sbom.get_package_spdxid(pkg_name) + spdx_package.name = pkg_name + spdx_package.versionInfo = d.getVar("PV") + spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses) + spdx_package.supplier = d.getVar("SPDX_SUPPLIER") + + package_doc.packages.append(spdx_package) + + package_doc.add_relationship(spdx_package, "GENERATED_FROM", "%s:%s" % (recipe_ref.externalDocumentId, recipe.SPDXID)) + package_doc.add_relationship(package_doc, "DESCRIBES", spdx_package) + + package_archive = deploy_dir_spdx / "packages" / (package_doc.name + ".tar.zst") + with optional_tarfile(package_archive, archive_packaged) as archive: + package_files = add_package_files( + d, + package_doc, + spdx_package, + pkgdest / package, + lambda file_counter: oe.sbom.get_packaged_file_spdxid(pkg_name, file_counter), + lambda filepath: ["BINARY"], + ignore_top_level_dirs=['CONTROL', 'DEBIAN'], + archive=archive, + ) + + if archive is not None: + spdx_package.packageFileName = 
str(package_archive.name)
+
+            add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources)
+
+        oe.sbom.write_doc(d, package_doc, pkg_arch, "packages", indent=get_json_indent(d))
+}
+do_create_spdx[vardepsexclude] += "BB_NUMBER_THREADS"
+# NOTE: depending on do_unpack is a hack that is necessary to get its dependencies for archiving the source
+addtask do_create_spdx after do_package do_packagedata do_unpack do_collect_spdx_deps before do_populate_sdk do_build do_rm_work
+
+SSTATETASKS += "do_create_spdx"
+do_create_spdx[sstate-inputdirs] = "${SPDXDEPLOY}"
+do_create_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
+
+python do_create_spdx_setscene () {
+    sstate_setscene(d)
+}
+addtask do_create_spdx_setscene
+
+do_create_spdx[dirs] = "${SPDXWORK}"
+do_create_spdx[cleandirs] = "${SPDXDEPLOY} ${SPDXWORK}"
+do_create_spdx[depends] += "${PATCHDEPENDENCY}"
+
+def collect_package_providers(d):
+    from pathlib import Path
+    import oe.sbom
+    import oe.spdx
+    import json
+
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+
+    providers = {}
+
+    deps = collect_direct_deps(d, "do_create_spdx")
+    deps.append((d.getVar("PN"), d.getVar("BB_HASHFILENAME")))
+
+    for dep_pn, dep_hashfn in deps:
+        localdata = d
+        recipe_data = oe.packagedata.read_pkgdata(dep_pn, localdata)
+        if not recipe_data:
+            localdata = bb.data.createCopy(d)
+            localdata.setVar("PKGDATA_DIR", "${PKGDATA_DIR_SDK}")
+            recipe_data = oe.packagedata.read_pkgdata(dep_pn, localdata)
+
+        for pkg in recipe_data.get("PACKAGES", "").split():
+
+            pkg_data = oe.packagedata.read_subpkgdata_dict(pkg, localdata)
+            rprovides = set(n for n, _ in bb.utils.explode_dep_versions2(pkg_data.get("RPROVIDES", "")).items())
+            rprovides.add(pkg)
+
+            if "PKG" in pkg_data:
+                pkg = pkg_data["PKG"]
+                rprovides.add(pkg)
+
+            for r in rprovides:
+                providers[r] = (pkg, dep_hashfn)
+
+    return providers
+
+collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA"
+
+python do_create_runtime_spdx() {
+    from
datetime import datetime, timezone + import oe.sbom + import oe.spdx + import oe.packagedata + from pathlib import Path + + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + spdx_deploy = Path(d.getVar("SPDXRUNTIMEDEPLOY")) + is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) + + creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") + + providers = collect_package_providers(d) + pkg_arch = d.getVar("SSTATE_PKGARCH") + package_archs = d.getVar("SSTATE_ARCHS").split() + package_archs.reverse() + + if not is_native: + bb.build.exec_func("read_subpackage_metadata", d) + + dep_package_cache = {} + + pkgdest = Path(d.getVar("PKGDEST")) + for package in d.getVar("PACKAGES").split(): + localdata = bb.data.createCopy(d) + pkg_name = d.getVar("PKG:%s" % package) or package + localdata.setVar("PKG", pkg_name) + localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package) + + if not oe.packagedata.packaged(package, localdata): + continue + + pkg_spdx_path = oe.sbom.doc_path(deploy_dir_spdx, pkg_name, pkg_arch, "packages") + + package_doc, package_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path) + + for p in package_doc.packages: + if p.name == pkg_name: + spdx_package = p + break + else: + bb.fatal("Package '%s' not found in %s" % (pkg_name, pkg_spdx_path)) + + runtime_doc = oe.spdx.SPDXDocument() + runtime_doc.name = "runtime-" + pkg_name + runtime_doc.documentNamespace = get_doc_namespace(localdata, runtime_doc) + runtime_doc.creationInfo.created = creation_time + runtime_doc.creationInfo.comment = "This document was created by analyzing package runtime dependencies." 
+ runtime_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] + runtime_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") + runtime_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) + runtime_doc.creationInfo.creators.append("Person: N/A ()") + + package_ref = oe.spdx.SPDXExternalDocumentRef() + package_ref.externalDocumentId = "DocumentRef-package-" + package + package_ref.spdxDocument = package_doc.documentNamespace + package_ref.checksum.algorithm = "SHA1" + package_ref.checksum.checksumValue = package_doc_sha1 + + runtime_doc.externalDocumentRefs.append(package_ref) + + runtime_doc.add_relationship( + runtime_doc.SPDXID, + "AMENDS", + "%s:%s" % (package_ref.externalDocumentId, package_doc.SPDXID) + ) + + deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "") + seen_deps = set() + for dep, _ in deps.items(): + if dep in seen_deps: + continue + + if dep not in providers: + continue + + (dep, dep_hashfn) = providers[dep] + + if not oe.packagedata.packaged(dep, localdata): + continue + + dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d) + dep_pkg = dep_pkg_data["PKG"] + + if dep in dep_package_cache: + (dep_spdx_package, dep_package_ref) = dep_package_cache[dep] + else: + dep_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, dep_pkg, dep_hashfn) + if not dep_path: + bb.fatal("No SPDX file found for package %s, %s" % (dep_pkg, dep_hashfn)) + + spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_path) + + for pkg in spdx_dep_doc.packages: + if pkg.name == dep_pkg: + dep_spdx_package = pkg + break + else: + bb.fatal("Package '%s' not found in %s" % (dep_pkg, dep_path)) + + dep_package_ref = oe.spdx.SPDXExternalDocumentRef() + dep_package_ref.externalDocumentId = "DocumentRef-runtime-dependency-" + spdx_dep_doc.name + dep_package_ref.spdxDocument = spdx_dep_doc.documentNamespace + dep_package_ref.checksum.algorithm = "SHA1" + 
dep_package_ref.checksum.checksumValue = spdx_dep_sha1 + + dep_package_cache[dep] = (dep_spdx_package, dep_package_ref) + + runtime_doc.externalDocumentRefs.append(dep_package_ref) + + runtime_doc.add_relationship( + "%s:%s" % (dep_package_ref.externalDocumentId, dep_spdx_package.SPDXID), + "RUNTIME_DEPENDENCY_OF", + "%s:%s" % (package_ref.externalDocumentId, spdx_package.SPDXID) + ) + seen_deps.add(dep) + + oe.sbom.write_doc(d, runtime_doc, pkg_arch, "runtime", spdx_deploy, indent=get_json_indent(d)) +} + +do_create_runtime_spdx[vardepsexclude] += "OVERRIDES SSTATE_ARCHS" + +addtask do_create_runtime_spdx after do_create_spdx before do_build do_rm_work +SSTATETASKS += "do_create_runtime_spdx" +do_create_runtime_spdx[sstate-inputdirs] = "${SPDXRUNTIMEDEPLOY}" +do_create_runtime_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}" + +python do_create_runtime_spdx_setscene () { + sstate_setscene(d) +} +addtask do_create_runtime_spdx_setscene + +do_create_runtime_spdx[dirs] = "${SPDXRUNTIMEDEPLOY}" +do_create_runtime_spdx[cleandirs] = "${SPDXRUNTIMEDEPLOY}" +do_create_runtime_spdx[rdeptask] = "do_create_spdx" + +def spdx_get_src(d): + """ + Save the patched source of the recipe in SPDX_WORKDIR. + """ + import shutil + spdx_workdir = d.getVar('SPDXWORK') + spdx_sysroot_native = d.getVar('STAGING_DIR_NATIVE') + pn = d.getVar('PN') + + workdir = d.getVar("WORKDIR") + + try: + # The kernel class functions require it to be on work-shared, so we don't change WORKDIR + if not is_work_shared_spdx(d): + # Change WORKDIR so that do_unpack and do_patch run in another dir. + d.setVar('WORKDIR', spdx_workdir) + # Restore the original path to the recipe's native sysroot (it's relative to WORKDIR). + d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native) + + # Changing 'WORKDIR' also changes 'B', so create the 'B' directory in + # case later tasks need it (some recipes' do_patch requires 'B' to + # exist). + bb.utils.mkdirhier(d.getVar('B')) + + bb.build.exec_func('do_unpack', d) + # Copy the kernel source to spdx_workdir + if is_work_shared_spdx(d): + share_src = d.getVar('WORKDIR') + d.setVar('WORKDIR', spdx_workdir) + d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native) + src_dir = spdx_workdir + "/" + d.getVar('PN') + "-" + d.getVar('PV') + "-" + d.getVar('PR') + bb.utils.mkdirhier(src_dir) + if bb.data.inherits_class('kernel', d): + share_src = d.getVar('STAGING_KERNEL_DIR') + cmd_copy_share = "cp -rf " + share_src + "/* " + src_dir + "/" + cmd_copy_shared_res = os.popen(cmd_copy_share).read() + bb.note("cmd_copy_shared_result = " + cmd_copy_shared_res) + + git_path = src_dir + "/.git" + if os.path.exists(git_path): + shutil.rmtree(git_path) + + # Make sure gcc and kernel sources are patched only once + if not (d.getVar('SRC_URI') == "" or is_work_shared_spdx(d)): + bb.build.exec_func('do_patch', d) + + # Some userland recipes have no source. + if not os.path.exists(spdx_workdir): + bb.utils.mkdirhier(spdx_workdir) + finally: + d.setVar("WORKDIR", workdir) + +spdx_get_src[vardepsexclude] += "STAGING_KERNEL_DIR" + +do_rootfs[recrdeptask] += "do_create_spdx do_create_runtime_spdx" +do_rootfs[cleandirs] += "${SPDXIMAGEWORK}" + +ROOTFS_POSTUNINSTALL_COMMAND =+ "image_combine_spdx" + +do_populate_sdk[recrdeptask] += "do_create_spdx do_create_runtime_spdx" +do_populate_sdk[cleandirs] += "${SPDXSDKWORK}" +POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk = " sdk_host_combine_spdx" +POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk = " sdk_target_combine_spdx" + +python image_combine_spdx() { + import os + import oe.sbom + from pathlib import Path + from oe.rootfs import image_list_installed_packages + + image_name = d.getVar("IMAGE_NAME") + image_link_name = d.getVar("IMAGE_LINK_NAME") + imgdeploydir = Path(d.getVar("IMGDEPLOYDIR")) + img_spdxid = oe.sbom.get_image_spdxid(image_name) + packages = image_list_installed_packages(d) + + combine_spdx(d, image_name,
imgdeploydir, img_spdxid, packages, Path(d.getVar("SPDXIMAGEWORK"))) + + def make_image_link(target_path, suffix): + if image_link_name: + link = imgdeploydir / (image_link_name + suffix) + if link != target_path: + link.symlink_to(os.path.relpath(target_path, link.parent)) + + spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst") + make_image_link(spdx_tar_path, ".spdx.tar.zst") +} + +python sdk_host_combine_spdx() { + sdk_combine_spdx(d, "host") +} + +python sdk_target_combine_spdx() { + sdk_combine_spdx(d, "target") +} + +def sdk_combine_spdx(d, sdk_type): + import oe.sbom + from pathlib import Path + from oe.sdk import sdk_list_installed_packages + + sdk_name = d.getVar("TOOLCHAIN_OUTPUTNAME") + "-" + sdk_type + sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR")) + sdk_spdxid = oe.sbom.get_sdk_spdxid(sdk_name) + sdk_packages = sdk_list_installed_packages(d, sdk_type == "target") + combine_spdx(d, sdk_name, sdk_deploydir, sdk_spdxid, sdk_packages, Path(d.getVar('SPDXSDKWORK'))) + +def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages, spdx_workdir): + import os + import oe.spdx + import oe.sbom + import io + import json + from datetime import timezone, datetime + from pathlib import Path + import tarfile + import bb.compress.zstd + + providers = collect_package_providers(d) + package_archs = d.getVar("SSTATE_ARCHS").split() + package_archs.reverse() + + creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") + deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) + source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") + + doc = oe.spdx.SPDXDocument() + doc.name = rootfs_name + doc.documentNamespace = get_doc_namespace(d, doc) + doc.creationInfo.created = creation_time + doc.creationInfo.comment = "This document was created by analyzing the source of the Yocto recipe during the build." 
+ doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] + doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") + doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) + doc.creationInfo.creators.append("Person: N/A ()") + + image = oe.spdx.SPDXPackage() + image.name = d.getVar("PN") + image.versionInfo = d.getVar("PV") + image.SPDXID = rootfs_spdxid + image.supplier = d.getVar("SPDX_SUPPLIER") + + doc.packages.append(image) + + for name in sorted(packages.keys()): + if name not in providers: + bb.fatal("Unable to find SPDX provider for '%s'" % name) + + pkg_name, pkg_hashfn = providers[name] + + pkg_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, pkg_name, pkg_hashfn) + if not pkg_spdx_path: + bb.fatal("No SPDX file found for package %s, %s" % (pkg_name, pkg_hashfn)) + + pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path) + + for p in pkg_doc.packages: + if p.name == name: + pkg_ref = oe.spdx.SPDXExternalDocumentRef() + pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name + pkg_ref.spdxDocument = pkg_doc.documentNamespace + pkg_ref.checksum.algorithm = "SHA1" + pkg_ref.checksum.checksumValue = pkg_doc_sha1 + + doc.externalDocumentRefs.append(pkg_ref) + doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID)) + break + else: + bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path)) + + runtime_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, "runtime-" + name, pkg_hashfn) + if not runtime_spdx_path: + bb.fatal("No runtime SPDX document found for %s, %s" % (name, pkg_hashfn)) + + runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path) + + runtime_ref = oe.spdx.SPDXExternalDocumentRef() + runtime_ref.externalDocumentId = "DocumentRef-%s" % runtime_doc.name + runtime_ref.spdxDocument = runtime_doc.documentNamespace + runtime_ref.checksum.algorithm = 
"SHA1" + runtime_ref.checksum.checksumValue = runtime_doc_sha1 + + # "OTHER" isn't ideal here, but I can't find a relationship that makes sense + doc.externalDocumentRefs.append(runtime_ref) + doc.add_relationship( + image, + "OTHER", + "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID), + comment="Runtime dependencies for %s" % name + ) + + image_spdx_path = spdx_workdir / (rootfs_name + ".spdx.json") + + with image_spdx_path.open("wb") as f: + doc.to_json(f, sort_keys=True, indent=get_json_indent(d)) + + num_threads = int(d.getVar("BB_NUMBER_THREADS")) + + visited_docs = set() + + index = {"documents": []} + + spdx_tar_path = rootfs_deploydir / (rootfs_name + ".spdx.tar.zst") + with bb.compress.zstd.open(spdx_tar_path, "w", num_threads=num_threads) as f: + with tarfile.open(fileobj=f, mode="w|") as tar: + def collect_spdx_document(path): + nonlocal tar + nonlocal deploy_dir_spdx + nonlocal source_date_epoch + nonlocal index + + if path in visited_docs: + return + + visited_docs.add(path) + + with path.open("rb") as f: + doc, sha1 = oe.sbom.read_doc(f) + f.seek(0) + + if doc.documentNamespace in visited_docs: + return + + bb.note("Adding SPDX document %s" % path) + visited_docs.add(doc.documentNamespace) + info = tar.gettarinfo(fileobj=f) + + info.name = doc.name + ".spdx.json" + info.uid = 0 + info.gid = 0 + info.uname = "root" + info.gname = "root" + + if source_date_epoch is not None and info.mtime > int(source_date_epoch): + info.mtime = int(source_date_epoch) + + tar.addfile(info, f) + + index["documents"].append({ + "filename": info.name, + "documentNamespace": doc.documentNamespace, + "sha1": sha1, + }) + + for ref in doc.externalDocumentRefs: + ref_path = oe.sbom.doc_find_by_namespace(deploy_dir_spdx, package_archs, ref.spdxDocument) + if not ref_path: + bb.fatal("Cannot find any SPDX file for document %s" % ref.spdxDocument) + collect_spdx_document(ref_path) + + collect_spdx_document(image_spdx_path) + + index["documents"].sort(key=lambda x: 
x["filename"]) + + index_str = io.BytesIO(json.dumps( + index, + sort_keys=True, + indent=get_json_indent(d), + ).encode("utf-8")) + + info = tarfile.TarInfo() + info.name = "index.json" + info.size = len(index_str.getvalue()) + info.uid = 0 + info.gid = 0 + info.uname = "root" + info.gname = "root" + + tar.addfile(info, fileobj=index_str) + +combine_spdx[vardepsexclude] += "BB_NUMBER_THREADS SSTATE_ARCHS" diff --git a/meta/classes/create-spdx.bbclass b/meta/classes/create-spdx.bbclass index 19c6c0ff0b..b604973ae0 100644 --- a/meta/classes/create-spdx.bbclass +++ b/meta/classes/create-spdx.bbclass @@ -5,4 +5,4 @@ # # Include this class when you don't care what version of SPDX you get; it will # be updated to the latest stable version that is supported -inherit create-spdx-2.2 +inherit create-spdx-3.0
From patchwork Thu Oct 26 10:48:42 2023 X-Patchwork-Submitter: Marta Rybczynska X-Patchwork-Id: 32947 From: Marta Rybczynska To: openembedded-core@lists.openembedded.org Cc:
richard.purdie@linuxfoundation.org, Louis Rannou Subject: [RFC][OE-core 2/7] oe/spdx: extend spdx.py objects Date: Thu, 26 Oct 2023 12:48:42 +0200 Message-ID: <20231026105033.257971-3-marta.rybczynska@syslinbit.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189713 From: Louis Rannou Extend the objects used to build the SPDX scheme: - add support for inheritance - hide all attributes starting with _spdx - add methods to list properties and item pairs - improve the serializer to match the SPDX3 scheme Signed-off-by: Louis Rannou --- meta/lib/oe/sbom.py | 2 +- meta/lib/oe/spdx.py | 30 +++++++++++++++++++++++------- 2 files changed, 24 insertions(+), 8 deletions(-) diff --git a/meta/lib/oe/sbom.py b/meta/lib/oe/sbom.py index fd4b6895d8..824839378a 100644 --- a/meta/lib/oe/sbom.py +++ b/meta/lib/oe/sbom.py @@ -77,7 +77,7 @@ def write_doc(d, spdx_doc, arch, subdir, spdx_deploy=None, indent=None): dest = doc_path(spdx_deploy, spdx_doc.name, arch, subdir) dest.parent.mkdir(exist_ok=True, parents=True) with dest.open("wb") as f: - doc_sha1 = spdx_doc.to_json(f, sort_keys=True, indent=indent) + doc_sha1 = spdx_doc.to_json(f, sort_keys=False, indent=indent) l = _doc_path_by_namespace(spdx_deploy, arch, spdx_doc.documentNamespace) l.parent.mkdir(exist_ok=True, parents=True) diff --git a/meta/lib/oe/spdx.py b/meta/lib/oe/spdx.py index 7aaf2af5ed..97b9e011ad 100644 --- a/meta/lib/oe/spdx.py +++ b/meta/lib/oe/spdx.py @@ -145,9 +145,13 @@ class MetaSPDXObject(type): def __new__(mcls, name, bases, attrs): attrs["_properties"] = {} - for key in attrs.keys(): - if isinstance(attrs[key], _Property): - prop = attrs[key] + at = {} + for basecls in bases: + at.update(basecls._properties) + at.update(attrs) + for key in at.keys(): + if isinstance(at[key], _Property): + prop = at[key] attrs["_properties"][key] = prop prop.set_property(attrs, key) @@ -166,15 +170,27 @@ class SPDXObject(metaclass=MetaSPDXObject): if name in d: self._spdx[name] = prop.init(d[name]) - def serializer(self): - return self._spdx - def __setattr__(self, name, value): - if name in self._properties or name == "_spdx": + # All attributes must be in _properties or be hidden variables + # prefixed with _spdx + if name in self._properties or name[:len("_spdx")] == "_spdx": super().__setattr__(name, value) return raise KeyError("%r is not a valid SPDX property" % name) + def properties(self): + return self._properties.keys() + + def items(self): + return self._properties.items() + + def serializer(self, rootElement): + main = {"type": self.__class__.__name__[len("SPDX3"):]} + for (key, value) in self._spdx.items(): + if key[0] == '_': + key = key[1:] + main.update({key: value}) + return main # # These are the SPDX objects implemented from the spec.
The *only* properties # that can be added to these objects are ones directly specified in the SPDX
From patchwork Thu Oct 26 10:48:43 2023 X-Patchwork-Submitter: Marta Rybczynska X-Patchwork-Id: 32948 From: Marta Rybczynska To: openembedded-core@lists.openembedded.org Cc: richard.purdie@linuxfoundation.org, Louis Rannou, Marta Rybczynska, Samantha Jalabert Subject: [RFC][OE-core 3/7] oe/sbom: change the write_doc to prepare for spdx3 Date: Thu, 26 Oct 2023 12:48:43 +0200 Message-ID: <20231026105033.257971-4-marta.rybczynska@syslinbit.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189714 From: Louis Rannou This
changes the prototype of write_doc, as the SPDX3 documentation does not yet specify which element is the root. Signed-off-by: Louis Rannou Signed-off-by: Marta Rybczynska Signed-off-by: Samantha Jalabert --- meta/lib/oe/sbom.py | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/meta/lib/oe/sbom.py b/meta/lib/oe/sbom.py index 824839378a..28db9cf719 100644 --- a/meta/lib/oe/sbom.py +++ b/meta/lib/oe/sbom.py @@ -68,7 +68,8 @@ def doc_path(spdx_deploy, doc_name, arch, subdir): return spdx_deploy / arch / subdir / (doc_name + ".spdx.json") -def write_doc(d, spdx_doc, arch, subdir, spdx_deploy=None, indent=None): +# WARNING: Changed for SPDX3 +def write_doc(d, spdx_graph, spdx_doc, arch, subdir, spdx_deploy=None, indent=None): from pathlib import Path if spdx_deploy is None: @@ -77,7 +78,7 @@ def write_doc(d, spdx_doc, arch, subdir, spdx_deploy=None, indent=None): dest = doc_path(spdx_deploy, spdx_doc.name, arch, subdir) dest.parent.mkdir(exist_ok=True, parents=True) with dest.open("wb") as f: - doc_sha1 = spdx_doc.to_json(f, sort_keys=False, indent=indent) + doc_sha1 = spdx_graph.to_json(f, sort_keys=False, indent=indent) l = _doc_path_by_namespace(spdx_deploy, arch, spdx_doc.documentNamespace) l.parent.mkdir(exist_ok=True, parents=True)
From patchwork Thu Oct 26 10:48:44 2023 X-Patchwork-Submitter: Marta Rybczynska X-Patchwork-Id: 32951 From: Marta Rybczynska To: openembedded-core@lists.openembedded.org Cc: richard.purdie@linuxfoundation.org, Louis Rannou, Samantha Jalabert Subject: [RFC][OE-core 4/7] create-spdx-3.0: SPDX3 objects as classes Date: Thu, 26 Oct 2023 12:48:44 +0200 Message-ID: <20231026105033.257971-5-marta.rybczynska@syslinbit.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189715 From: Louis Rannou Create SPDX3 objects as classes, as they are described in the SPDX3 model. Signed-off-by: Louis Rannou Signed-off-by: Samantha Jalabert --- meta/lib/oe/spdx3.py | 385 +++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 385 insertions(+) create mode 100644 meta/lib/oe/spdx3.py diff --git a/meta/lib/oe/spdx3.py b/meta/lib/oe/spdx3.py new file mode 100644 index 0000000000..ecbe999258 --- /dev/null +++ b/meta/lib/oe/spdx3.py @@ -0,0 +1,385 @@ +# +# Copyright OpenEmbedded Contributors +# +# SPDX-License-Identifier: GPL-2.0-only +# + +# +# This library is intended to set the data types for the SPDX3 specification. It +# is not intended to encode any particular OE specific behaviors; see +# sbom.py for that.
+# + +from oe.spdx import _String, _StringList, _Object, _ObjectList +from oe.spdx import SPDXObject + +import json +import hashlib + +class SPDX3Tool(SPDXObject): + pass + +class SPDX3Agent(SPDXObject): + pass + +# +# Profile: Core - Enumerations +# +SPDX3HashAlgorithm = [ + "blake2b256", + "blake2b384", + "blake2b512", + "blake3", + "crystalsKyber", + "crystalsDilithium", + "falcon", + "md2", + "md4", + "md5", + "md6", + "other", + "sha1", + "sha224", + "sha256", + "sha3_224", + "sha3_256", + "sha3_384", + "sha3_512", + "sha384", + "sha512", + "spdxPvcSha1", + "spdxPvcSha256", + "sphincsPlus" +] + +# +# Profile: Core - Datatypes +# + +class SPDX3IntegrityMethod(SPDXObject): + comment = _String() + +class SPDX3Hash(SPDX3IntegrityMethod): + hashValue = _String() + algorithm = _String() + +# +# Profile: Core - Classes +# +class SPDX3CreationInfo(SPDXObject): + specVersion = _String(default="3.0.0") + created = _String() + createdBy = _ObjectList(SPDX3Agent) + profile = _StringList(default=["core", "software"]) # TODO: not in creationInfo in spec + createdUsing = _ObjectList(SPDX3Tool) + dataLicense = _String(default="CC0-1.0") + + def serializer(self): + """ + Serialize a creationInfo element. + createdBy and createdUsing are only stored with their spdxId. 
+ other attributes are serialized as-is + """ + main = {"type": self.__class__.__name__[len("SPDX3"):], + "createdBy": []} + + main["createdBy"] = [c.spdxId for c in self._spdx["createdBy"]] + if "createdUsing" in self._spdx and len(self._spdx["createdUsing"]): + main["createdUsing"] = [c.spdxId for c in self._spdx["createdUsing"]] + + for (key, value) in self._spdx.items(): + if key not in ["createdBy", "createdUsing"]: + main.update({key: value}) + return main + +class SPDX3ExternalMap(SPDXObject): + externalId = _String() + verifiedUsing = _ObjectList(SPDX3IntegrityMethod) + definingDocument = _String() + +class SPDX3Element(SPDXObject): + spdxId = _String(default="SPDXRef-DOCUMENT") + name = _String() + summary = _String() + description = _String() + creationInfo = _String() + verifiedUsing = _ObjectList(SPDX3IntegrityMethod) +# packages = _ObjectList(SPDXPackage) +# files = _ObjectList(SPDXFile) +# relationships = _ObjectList(SPDXRelationship) +# externalDocumentRefs = _ObjectList(SPDXExternalDocumentRef) +# hasExtractedLicensingInfos = _ObjectList(SPDXExtractedLicensingInfo) + + def serializer(self, rootElement): + """ + Default serialization of an Element + creationInfo is moved to the root and referred to by its id + context and element defined in ElementCollection and Bundle are ignored + Element objects are ignored + other attributes are serialized as-is + """ + main = {"type": self.__class__.__name__[len("SPDX3"):]} + + for (key, value) in self._spdx.items(): + if key == "creationInfo": + _id = rootElement.creationinfo(value) + main["creationInfo"] = _id + elif key not in ["context", "element"] \ + and not isinstance(value, SPDX3Element): + if key[0] == '_': + main.update({key[1:]: value}) + else: + main.update({key: value}) + return main + + def add_relationship(self, _from, relationship, _to): + if isinstance(_from, SPDX3Element): + from_spdxid = _from.spdxId + else: + from_spdxid = _from + + if isinstance(_to, SPDX3Element): + to_spdxid =
_to.spdxId + else: + to_spdxid = _to + + for element in self.element: + if isinstance(element, SPDX3Relationship) \ + and element._from == from_spdxid \ + and element.relationshipType == relationship: + element.to.append(to_spdxid) + return element.spdxId + + r = SPDX3Relationship( + _from=from_spdxid, + relationshipType=relationship, + to = [to_spdxid] + ) + + self.element.append(r) + return r.spdxId + + def find_external_map(self, sourceDocumentNamespace): + for i in self.imports: + if i.definingDocument == sourceDocumentNamespace: + return i + +class SPDX3Relationship(SPDX3Element): + spdxId = _String(default="SPDXRef-Relationship") # TODO: increment id + _from = _String() + to = _StringList() + relationshipType = _String() + +class SPDX3Annotation(SPDX3Element): + spdxId = _String(default="SPDXRef-Annotation") # TODO: increment id + annotationType = _String() + statement = _String() + subject = _String() + +class SPDX3Agent(SPDX3Element): + pass + +class SPDX3Person(SPDX3Agent): + pass + +class SPDX3Organization(SPDX3Agent): + pass + +class SPDX3Tool(SPDX3Element): + pass + +class SPDX3Artifact(SPDX3Element): + suppliedBy = _ObjectList(SPDX3Agent) + +class SPDX3ElementCollection(SPDX3Element): + element = _ObjectList(SPDX3Element) + imports = _ObjectList(SPDX3ExternalMap) + +class SPDX3Bundle(SPDX3ElementCollection): + context = _String(default="") + +class SPDX3SpdxDocument(SPDX3Bundle): + documentNamespace = _String() # TODO: where is this definition ? + creationInfo = _Object(SPDX3CreationInfo) + + _spdxcounter = 1 + + def __init__(self): + self._spdxcreationinfo = {} + super().__init__() + + def creationinfo(self, c): + """ + Look up a creationInfo in the dictionary. If it is not there, + create a unique id for it and store it. + Return the id.
+ """ + for (_id, info) in self._spdxcreationinfo.items(): + if c == info: + return _id + _id = "_:CreationInfo{}".format(SPDX3SpdxDocument._spdxcounter) + SPDX3SpdxDocument._spdxcounter += 1 + self._spdxcreationinfo[_id] = c + return _id + + def serializer(self, rootElement): + """ + Serialize a SpdxDocument element. + context has a specific serialization + attributes of type Element are moved to root + other attributes are serialized as-is (context and element are ignored) + all elements are moved to root + """ + chunk = {"@context": [self.context, {}]} + root = super().serializer(rootElement) + chunk["@graph"] = [] + + body = [] + for (_, value) in self._spdx.items(): + if isinstance(value, SPDX3Element): + body.append(value.serializer(rootElement)) + + if len(self.element): + root["element"] = [] + for e in self.element: + root["element"].append(e.spdxId) + body.append(e.serializer(rootElement)) + + for (_id, c) in self._spdxcreationinfo.items(): + cser = {"@id": _id} + cser.update(c.serializer()) + chunk["@graph"].append(cser) + + chunk["@graph"].append(root) + chunk["@graph"] = chunk["@graph"] + body + + return chunk + + def to_json(self, f, *, sort_keys=False, indent=None, separators=None): + class Encoder(json.JSONEncoder): + def __init__(self, rootElement=None, **kwargs): + self.rootElement = rootElement + super(Encoder, self).__init__(**kwargs) + + def default(self, o): + if isinstance(o, SPDX3SpdxDocument): + return o.serializer(self.rootElement) + elif isinstance(o, SPDXObject): + root = o.serializer(self.rootElement) + return root + + return super().default(o) + + sha1 = hashlib.sha1() + for chunk in Encoder( + rootElement=self, + sort_keys=sort_keys, + indent=indent, + separators=separators, + ).iterencode(self): + chunk = chunk.encode("utf-8") + f.write(chunk) + sha1.update(chunk) + + return sha1.hexdigest() + + @classmethod + def from_json(cls, f, attributes=[]): + """ + Look into a JSON file for all objects of the given type.
Return the document + element and a dictionary of required objects. + """ + class Decoder(json.JSONDecoder): + def __init__(self, *args, **kwargs): + super().__init__(object_hook=self.object_hook, *args, **kwargs) + + def object_hook(self, d): + if 'type' in d.keys(): + if d['type'] in attributes or d['type'] == 'SpdxDocument': + return d + if '@graph' in d.keys(): + spdxDocument = None + attr = {a: [] for a in attributes} + for p in d['@graph']: + if p is not None: + if p['type'] == 'SpdxDocument': + spdxDocument = p + else: + attr[p['type']].append(p) + return (spdxDocument, attr) + + return json.load(f, cls=Decoder) + +# +# Profile: Software - Datatypes +# +SPDX3SoftwarePurpose = [ + "application", + "archive", + "bom", + "configuration", + "container", + "data", + "device", + "documentation", + "executable", + "file", + "firmware", + "framework", + "install", + "library", + "module", + "operatingSystem", + "patch", + "source", + "other" +] + +class SPDX3SoftwareArtifact(SPDX3Artifact): + primaryPurpose = _String() + additionalPurpose = _StringList() + +class SPDX3Package(SPDX3SoftwareArtifact): + packageVersion = _String() + homePage = _String() + downloadLocation = _String() + +class SPDX3File(SPDX3SoftwareArtifact): + pass + +# +# OpenEmbedded base class +# +class SPDX3Graph(SPDXObject): + # TODO: rework: graph should only have a list of objects and more + # intelligence in to_json + package = _Object(SPDX3Package) + creationInfo = _Object(SPDX3CreationInfo) + doc = _Object(SPDX3SpdxDocument) + tool = _Object(SPDX3Tool) + organization = _Object(SPDX3Organization) + person = _Object(SPDX3Person) + + def __init__(self, **d): + super().__init__(**d) + + + def to_json(self, f, *, sort_keys=False, indent=None, separators=None): + class Encoder(json.JSONEncoder): + def default(self, o): + if isinstance(o, SPDXObject): + return o.serializer() + + return super().default(o) + + sha1 = hashlib.sha1() + for chunk in Encoder( + sort_keys=sort_keys, + indent=indent, + 
separators=separators, + ).iterencode(self): + chunk = chunk.encode("utf-8") + f.write(chunk) + sha1.update(chunk) + + return sha1.hexdigest() From patchwork Thu Oct 26 10:48:45 2023 X-Patchwork-Submitter: Marta Rybczynska X-Patchwork-Id: 32950 From: Marta Rybczynska To: openembedded-core@lists.openembedded.org Cc: richard.purdie@linuxfoundation.org, Louis Rannou Subject: [RFC][OE-core 5/7] oe/sbom: search into json Date: Thu, 26 Oct 2023 12:48:45 +0200 Message-ID: <20231026105033.257971-6-marta.rybczynska@syslinbit.com> In-Reply-To: <20231026105033.257971-1-marta.rybczynska@syslinbit.com> References: <20231026105033.257971-1-marta.rybczynska@syslinbit.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189716 From: Louis Rannou
Create a function that searches a JSON-LD file for given object types instead of completely loading it. Signed-off-by: Louis Rannou --- meta/lib/oe/sbom.py | 32 ++++++++++++++++++++++++++++++++ 1 file changed, 32 insertions(+) diff --git a/meta/lib/oe/sbom.py b/meta/lib/oe/sbom.py index 28db9cf719..21333c0a84 100644 --- a/meta/lib/oe/sbom.py +++ b/meta/lib/oe/sbom.py @@ -119,3 +119,35 @@ def read_doc(fn): doc = oe.spdx.SPDXDocument.from_json(f) return (doc, sha1.hexdigest()) + + +def search_doc(fn, attr_types=None): + """ + Look for the given attribute types in the file. Return the document + element, a dictionary of the matching attributes and the sha1 of the file. + """ + import hashlib + import oe.spdx3 + import io + import contextlib + + @contextlib.contextmanager + def get_file(): + if isinstance(fn, io.IOBase): + yield fn + else: + with fn.open("rb") as f: + yield f + + with get_file() as f: + sha1 = hashlib.sha1() + while True: + chunk = f.read(4096) + if not chunk: + break + sha1.update(chunk) + + f.seek(0) + doc, attributes = oe.spdx3.SPDX3SpdxDocument.from_json(f, attr_types or []) + + return (doc, attributes, sha1.hexdigest()) From patchwork Thu Oct 26 10:48:46 2023 X-Patchwork-Submitter: Marta Rybczynska X-Patchwork-Id: 32952 From: Marta Rybczynska To: openembedded-core@lists.openembedded.org Cc: richard.purdie@linuxfoundation.org, Marta Rybczynska Subject: [RFC][OE-core 6/7] README.SPDX3: add file Date: Thu, 26 Oct 2023 12:48:46 +0200 Message-ID: <20231026105033.257971-7-marta.rybczynska@syslinbit.com> In-Reply-To: <20231026105033.257971-1-marta.rybczynska@syslinbit.com> References: <20231026105033.257971-1-marta.rybczynska@syslinbit.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189717 Add a specific readme for SPDX3 with open questions and other notes related to the PoC. Signed-off-by: Marta Rybczynska --- README.SPDX3 | 42 ++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 42 insertions(+) create mode 100644 README.SPDX3 diff --git a/README.SPDX3 b/README.SPDX3 new file mode 100644 index 0000000000..57f98756ab --- /dev/null +++ b/README.SPDX3 @@ -0,0 +1,42 @@ +This repository contains the Proof-of-Concept code for SPDX3 support +in the Yocto Project. + +What does the code include: +* The SPDX3 generation with JSON-LD serialization, still using the .json extension +* Implementations of the core and software profiles + +Here are the known limitations: +* At the time of writing this code, the SPDX3 specification is still undergoing + changes. In particular, the root element has not yet been decided. Because of + that, the code might require changes when the final specification is + released. + +* Some parts of the SPDX3 specification require clarification.
Current issues: + - Software.Package.homepage is sometimes also called homePage: need to + confirm the spelling + - Core.Relationship.from needs special care in Python as it conflicts + with a reserved keyword + - should suppliedBy be serialized as an array or as a single string? + - In examples, SpdxDocument has an attribute namespace. It does not in the + documentation. + - what is the equivalent of the documentNamespace that was in 2.2? + +* SPDX3 introduces a modular model, where content depends on the profile used. + The configuration of profiles to generate needs to be reworked. Today, + generation is gated by variables shared with the SPDX2.2 code like + SPDX_INCLUDE_SOURCES. In SPDX3 it could be done by enabling specific + profiles and variables like SPDX3_ENABLE_LICENSING or SPDX3_ENABLE_SECURITY. + +* The implementation includes data similar to the YP SPDX 2.2 content. SPDX 3.0 + has additional profiles and fields that did not exist in the earlier version. + The project needs a discussion on what is useful to include in the YP SPDX. + Additional profiles and classes might be implemented to carry that data. + +* The security profile implementation has been prototyped. However, some of + the needed data must come from the cve-check database (for example: + CVSS). Obtaining the information is possible, but will require a dependency on + cve-check to download the database, and then a refactoring of the cve-check + database accesses so that they can be done from other classes while keeping + correct locks. Also, VulnAssessmentRelationship requires classifying + fixes as "Fixed" or "NotAffected", while YP cve-check has only one category + for both. At the time of writing, there is a patch on the ML.
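The Core.Relationship.from issue listed above is why the PoC library stores the value in a `_from` attribute (see `SPDX3Relationship` in patch 1/7). A minimal, self-contained sketch of that pattern; the `Relationship` class and its key-renaming `serializer` here are illustrative only, not the patch's actual implementation:

```python
import json

class Relationship:
    # "from" is a reserved keyword in Python, so it cannot be used as an
    # attribute name; store the value under "_from" instead.
    def __init__(self, _from, to, relationship_type):
        self._from = _from
        self.to = to
        self.relationshipType = relationship_type

    def serializer(self):
        # Map the internal "_from" attribute back to the "from" key
        # expected in the serialized SPDX output.
        return {
            "from": self._from,
            "to": self.to,
            "relationshipType": self.relationshipType,
        }

r = Relationship("SPDXRef-Package-foo", ["SPDXRef-File-bar"], "contains")
print(json.dumps(r.serializer(), sort_keys=True))
# prints {"from": "SPDXRef-Package-foo", "relationshipType": "contains", "to": ["SPDXRef-File-bar"]}
```

The same trick works on input: a decoder can rename the "from" key to "_from" before handing the dictionary to the class constructor.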
From patchwork Thu Oct 26 10:48:47 2023 X-Patchwork-Submitter: Marta Rybczynska X-Patchwork-Id: 32953 From: Marta Rybczynska To: openembedded-core@lists.openembedded.org Cc: richard.purdie@linuxfoundation.org, Samantha Jalabert Subject: [RFC][OE-core 7/7] create-spdx-3.0: support for recipe spdx creation Date: Thu, 26 Oct 2023 12:48:47 +0200 Message-ID: <20231026105033.257971-8-marta.rybczynska@syslinbit.com> In-Reply-To: <20231026105033.257971-1-marta.rybczynska@syslinbit.com> References: <20231026105033.257971-1-marta.rybczynska@syslinbit.com> X-Groupsio-URL: https://lists.openembedded.org/g/openembedded-core/message/189718 From: Samantha Jalabert Change functions and tasks to match the SPDX 3 model.
Signed-off-by: Samantha Jalabert --- meta/classes/create-spdx-3.0.bbclass | 728 +++++++++------------------ 1 file changed, 224 insertions(+), 504 deletions(-) diff --git a/meta/classes/create-spdx-3.0.bbclass b/meta/classes/create-spdx-3.0.bbclass index b0aef80db1..ffe34969a8 100644 --- a/meta/classes/create-spdx-3.0.bbclass +++ b/meta/classes/create-spdx-3.0.bbclass @@ -11,7 +11,7 @@ DEPLOY_DIR_SPDX ??= "${DEPLOY_DIR}/spdx" CVE_PRODUCT ??= "${BPN}" CVE_VERSION ??= "${PV}" -SPDXDIR ??= "${WORKDIR}/spdx" +SPDXDIR ??= "${WORKDIR}/spdx-3.0" SPDXDEPLOY = "${SPDXDIR}/deploy" SPDXWORK = "${SPDXDIR}/work" SPDXIMAGEWORK = "${SPDXDIR}/image-work" @@ -64,21 +64,74 @@ def get_doc_namespace(d, doc): namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, d.getVar("SPDX_UUID_NAMESPACE")) return "%s/%s-%s" % (d.getVar("SPDX_NAMESPACE_PREFIX"), doc.name, str(uuid.uuid5(namespace_uuid, doc.name))) -def create_annotation(d, comment): +def generate_creationInfo(d, document): + """ + Generate the creationInfo and its elements for a document + """ from datetime import datetime, timezone + import oe.spdx3 creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") - annotation = oe.spdx.SPDXAnnotation() - annotation.annotationDate = creation_time - annotation.annotationType = "OTHER" - annotation.annotator = "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) - annotation.comment = comment - return annotation + + document.creationInfo = oe.spdx3.SPDX3CreationInfo() + document.creationInfo.specVersion = "3.0.0" + document.creationInfo.created = creation_time + document.creationInfo.dataLicense = "https://spdx.org/licenses/CC0-1.0" + + tool = oe.spdx3.SPDX3Tool() + tool.name = "OpenEmbedded Core create-spdx.bbclass" + tool.spdxId = "spdx-" + d.getVar("PN") + ":SPDXRef-Actor-" + tool.name.replace(" ", "") + tool.creationInfo = document.creationInfo + document.element.append(tool) + document.creationInfo.createdUsing.append(tool) + + organization = 
oe.spdx3.SPDX3Organization() + organization.name = d.getVar("SPDX_ORG") + organization.spdxId = "spdx-" + d.getVar("PN") + ":SPDXRef-Actor-" + organization.name.replace(" ", "") + organization.creationInfo = document.creationInfo + document.element.append(organization) + document.creationInfo.createdBy.append(organization) + + person = oe.spdx3.SPDX3Person() + person.name = "Person: N/A ()" + person.spdxId = "spdx-" + d.getVar("PN") + ":SPDXRef-Actor-" + person.name.replace(" ", "") + document.creationInfo.createdBy.append(person) + document.element.append(person) + +def get_supplier(d, doc=None): + """ + Get the supplier of a document or create it. + """ + import oe.spdx3 + + supplier = d.getVar("SPDX_SUPPLIER") + agentName = supplier.split(": ")[1] + agentType = supplier.split(": ")[0] + + if doc: + for element in doc.element: + if(isinstance(element, oe.spdx3.SPDX3Agent) and element.name == agentName): + return element + + if(agentType == "Organization"): + agent = oe.spdx3.SPDX3Organization() + elif(agentType == "Person"): + agent = oe.spdx3.SPDX3Person() + else: + raise KeyError("%r is not a valid SPDX agent type" % agentType) + + agent.name = agentName + agent.spdxId = "spdx-" + d.getVar("PN") + ":SPDXRef-Actor-" + agent.name.replace(" ", "") + agent.creationInfo = doc.creationInfo + + return agent def recipe_spdx_is_native(d, recipe): - return any(a.annotationType == "OTHER" and - a.annotator == "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) and - a.comment == "isNative" for a in recipe.annotations) + return False +# TODO: find a better way to mark native recipes +# return any(a.annotationType == "OTHER" and +# a.annotator == "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) and +# a.comment == "isNative" for a in recipe.annotations) def is_work_shared_spdx(d): return bb.data.inherits_class('kernel', d) or ('work-shared' in d.getVar('WORKDIR')) @@ -113,7 +166,7 @@ def convert_license_to_spdx(lic, 
document, d, existing={}): if name in extracted: return - extracted_info = oe.spdx.SPDXExtractedLicensingInfo() + extracted_info = oe.spdx3.SPDX3ExtractedLicensingInfo() extracted_info.name = name extracted_info.licenseId = ident extracted_info.extractedText = None @@ -202,8 +255,7 @@ def process_sources(d): def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]): from pathlib import Path - import oe.spdx - import hashlib + import oe.spdx3 source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") if source_date_epoch: @@ -223,11 +275,18 @@ def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archiv filename = str(filepath.relative_to(topdir)) if not filepath.is_symlink() and filepath.is_file(): - spdx_file = oe.spdx.SPDXFile() - spdx_file.SPDXID = get_spdxid(file_counter) - for t in get_types(filepath): - spdx_file.fileTypes.append(t) - spdx_file.fileName = filename + spdx_file = oe.spdx3.SPDX3File() + spdx_file.name = filename + spdx_file.spdxId = get_spdxid(file_counter) + spdx_file.primaryPurpose = None + spdx_file.additionalPurpose = [] + types = get_types(filepath) + for t in types: + if t in oe.spdx3.SPDX3SoftwarePurpose: + if spdx_file.primaryPurpose is None: + spdx_file.primaryPurpose = t + else: + spdx_file.additionalPurpose.append(t) if archive is not None: with filepath.open("rb") as f:
doc.add_relationship(spdx_pkg, "CONTAINS", spdx_file) - spdx_pkg.hasFiles.append(spdx_file.SPDXID) - spdx_files.append(spdx_file) + hashSha1 = oe.spdx3.SPDX3Hash() + hashSha1.algorithm = "sha1" + hashSha1.hashValue = sha1 + spdx_file.verifiedUsing.append(hashSha1) - file_counter += 1 + hashSha256 = oe.spdx3.SPDX3Hash() + hashSha256.algorithm = "sha256" + hashSha256.hashValue = bb.utils.sha256_file(filepath) + spdx_file.verifiedUsing.append(hashSha256) + + # TODO: Rework when License Profile implemented + #if "SOURCE" in spdx_file.fileTypes: + # extracted_lics = extract_licenses(filepath) + # if extracted_lics: + # spdx_file.licenseInfoInFiles = extracted_lics - sha1s.sort() - verifier = hashlib.sha1() - for v in sha1s: - verifier.update(v.encode("utf-8")) - spdx_pkg.packageVerificationCode.packageVerificationCodeValue = verifier.hexdigest() + doc.element.append(spdx_file) + + doc.add_relationship(spdx_pkg, "contains", spdx_file) + + spdx_files.append(spdx_file) + file_counter += 1 return spdx_files def add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources): from pathlib import Path - import hashlib import oe.packagedata - import oe.spdx + import oe.spdx3 debug_search_paths = [ Path(d.getVar('PKGD')), @@ -299,15 +353,15 @@ def add_package_sources_from_debug(d, package_doc, spdx_package, package, packag continue for pkg_file in package_files: - if file_path.lstrip("/") == pkg_file.fileName.lstrip("/"): + if file_path.lstrip("/") == pkg_file.name.lstrip("/"): break else: bb.fatal("No package file found for %s in %s; SPDX found: %s" % (str(file_path), package, - " ".join(p.fileName for p in package_files))) + " ".join(p.name for p in package_files))) continue for debugsrc in file_data["debugsrc"]: - ref_id = "NOASSERTION" + ref_id = None for search in debug_search_paths: if debugsrc.startswith("/usr/src/kernel"): debugsrc_path = search / debugsrc.replace('/usr/src/kernel/', '') @@ -320,24 +374,32 @@ def 
add_package_sources_from_debug(d, package_doc, spdx_package, package, packag if file_sha256 in sources: source_file = sources[file_sha256] - - doc_ref = package_doc.find_external_document_ref(source_file.doc.documentNamespace) + doc_ref = package_doc.find_external_map(source_file.doc.documentNamespace) if doc_ref is None: - doc_ref = oe.spdx.SPDXExternalDocumentRef() - doc_ref.externalDocumentId = "DocumentRef-dependency-" + source_file.doc.name - doc_ref.spdxDocument = source_file.doc.documentNamespace - doc_ref.checksum.algorithm = "SHA1" - doc_ref.checksum.checksumValue = source_file.doc_sha1 - package_doc.externalDocumentRefs.append(doc_ref) - - ref_id = "%s:%s" % (doc_ref.externalDocumentId, source_file.file.SPDXID) + doc_ref = oe.spdx3.SPDX3ExternalMap() + doc_ref.externalId = "DocumentRef-dependency-" + source_file.doc.name + doc_ref.verifiedUsing = oe.spdx3.SPDX3Hash() + doc_ref.verifiedUsing.algorithm = "sha1" + doc_ref.verifiedUsing.hashValue = source_file.doc_sha1 + doc_ref.definingDocument = source_file.doc.documentNamespace + + package_doc.imports.append(doc_ref) + + ref_id = "%s:%s" % (doc_ref.externalId, source_file.file.spdxId) else: bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256)) break else: bb.debug(1, "Debug source %s not found" % debugsrc) - package_doc.add_relationship(pkg_file, "GENERATED_FROM", ref_id, comment=debugsrc) + relation_id = package_doc.add_relationship(ref_id, "generates", pkg_file) + comment = oe.spdx3.SPDX3Annotation() + comment.subject = relation_id + comment.annotationType = "other" + comment.statement = "debugsrc" + package_doc.element.append(comment) + + return add_package_sources_from_debug[vardepsexclude] += "STAGING_KERNEL_DIR" @@ -345,7 +407,7 @@ def collect_dep_recipes(d, doc, spdx_recipe): import json from pathlib import Path import oe.sbom - import oe.spdx + import oe.spdx3 deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) spdx_deps_file = 
Path(d.getVar("SPDXDEPS")) @@ -362,10 +424,10 @@ def collect_dep_recipes(d, doc, spdx_recipe): if not dep_recipe_path: bb.fatal("Cannot find any SPDX file for recipe %s, %s" % (dep_pn, dep_hashfn)) - spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_recipe_path) + spdx_dep_doc, spdx_dep_pkg, spdx_dep_sha1 = oe.sbom.search_doc(dep_recipe_path, ["Package"]) - for pkg in spdx_dep_doc.packages: - if pkg.name == dep_pn: + for pkg in spdx_dep_pkg['Package']: + if pkg["name"] == dep_pn: spdx_dep_recipe = pkg break else: @@ -373,19 +435,15 @@ def collect_dep_recipes(d, doc, spdx_recipe): dep_recipes.append(oe.sbom.DepRecipe(spdx_dep_doc, spdx_dep_sha1, spdx_dep_recipe)) - dep_recipe_ref = oe.spdx.SPDXExternalDocumentRef() - dep_recipe_ref.externalDocumentId = "DocumentRef-dependency-" + spdx_dep_doc.name - dep_recipe_ref.spdxDocument = spdx_dep_doc.documentNamespace - dep_recipe_ref.checksum.algorithm = "SHA1" - dep_recipe_ref.checksum.checksumValue = spdx_dep_sha1 - - doc.externalDocumentRefs.append(dep_recipe_ref) + dep_recipe_ref = oe.spdx3.SPDX3ExternalMap() + dep_recipe_ref.externalId = "DocumentRef-%s" % spdx_dep_doc["name"] + hashSha1 = oe.spdx3.SPDX3Hash() + hashSha1.algorithm = "sha1" + hashSha1.hashValue = spdx_dep_sha1 + dep_recipe_ref.verifiedUsing.append(hashSha1) - doc.add_relationship( - "%s:%s" % (dep_recipe_ref.externalDocumentId, spdx_dep_recipe.SPDXID), - "BUILD_DEPENDENCY_OF", - spdx_recipe - ) + doc.imports.append(dep_recipe_ref) + doc.add_relationship("%s:%s" % (dep_recipe_ref.externalId, spdx_dep_recipe["spdxId"]), "buildDependency", spdx_recipe) return dep_recipes @@ -393,24 +451,35 @@ collect_dep_recipes[vardepsexclude] = "SSTATE_ARCHS" def collect_dep_sources(d, dep_recipes): import oe.sbom + import oe.spdx3 sources = {} for dep in dep_recipes: # Don't collect sources from native recipes as they # match non-native sources also. 
- if recipe_spdx_is_native(d, dep.recipe): - continue - recipe_files = set(dep.recipe.hasFiles) - - for spdx_file in dep.doc.files: - if spdx_file.SPDXID not in recipe_files: - continue + if hasattr(dep.doc, "element"): + for element in dep.doc.element: + if isinstance(element, oe.spdx3.SPDX3Annotation) \ + and element.subject == dep.recipe.spdxId \ + and element.statement == "isNative": + continue - if "SOURCE" in spdx_file.fileTypes: - for checksum in spdx_file.checksums: - if checksum.algorithm == "SHA256": - sources[checksum.checksumValue] = oe.sbom.DepSource(dep.doc, dep.doc_sha1, dep.recipe, spdx_file) - break + recipe_files = [] + + if hasattr(dep.doc, "element"): + for element in dep.doc.element: + if isinstance(element, oe.spdx3.SPDX3Relationship) and element._from == dep.recipe.spdxId and element.relationshipType == "contains": + recipe_files = element.to + + for element in dep.doc.element: + if isinstance(element, oe.spdx3.SPDX3File) \ + and element.spdxId in recipe_files \ + and (element.primaryPurpose == "source" or "source" in element.additionalPurpose): + for checksum in element.verifiedUsing: + if "algorithm" in checksum.properties() \ + and checksum.algorithm == "sha256": + sources[checksum.hashValue] = oe.sbom.DepSource(dep.doc, dep.doc_sha1, dep.recipe, element) + break return sources @@ -418,16 +487,16 @@ def add_download_packages(d, doc, recipe): import os.path from bb.fetch2 import decodeurl, CHECKSUM_LIST import bb.process - import oe.spdx + import oe.spdx3 import oe.sbom for download_idx, src_uri in enumerate(d.getVar('SRC_URI').split()): f = bb.fetch2.FetchData(src_uri, d) for name in f.names: - package = oe.spdx.SPDXPackage() + package = oe.spdx3.SPDX3Package() package.name = "%s-source-%d" % (d.getVar("PN"), download_idx + 1) - package.SPDXID = oe.sbom.get_download_spdxid(d, download_idx + 1) + package.spdxId = oe.sbom.get_download_spdxid(d, download_idx + 1) if f.type == "file": continue @@ -443,42 +512,28 @@ def
add_download_packages(d, doc, recipe): if f.method.supports_checksum(f): for checksum_id in CHECKSUM_LIST: - if checksum_id.upper() not in oe.spdx.SPDXPackage.ALLOWED_CHECKSUMS: + if checksum_id not in oe.spdx3.SPDX3HashAlgorithm: continue expected_checksum = getattr(f, "%s_expected" % checksum_id) if expected_checksum is None: continue - c = oe.spdx.SPDXChecksum() + c = oe.spdx3.SPDX3Hash() c.algorithm = checksum_id.upper() - c.checksumValue = expected_checksum - package.checksums.append(c) + c.hashValue = expected_checksum + package.verifiedUsing.append(c) package.downloadLocation = uri - doc.packages.append(package) - doc.add_relationship(doc, "DESCRIBES", package) - # In the future, we might be able to do more fancy dependencies, - # but this should be sufficient for now - doc.add_relationship(package, "BUILD_DEPENDENCY_OF", recipe) + doc.element.append(package) -def collect_direct_deps(d, dep_task): - current_task = "do_" + d.getVar("BB_CURRENTTASK") - pn = d.getVar("PN") + doc.add_relationship(doc, "describes", package) + doc.add_relationship(package, "buildDependency", recipe) - taskdepdata = d.getVar("BB_TASKDEPDATA", False) - for this_dep in taskdepdata.values(): - if this_dep[0] == pn and this_dep[1] == current_task: - break - else: - bb.fatal(f"Unable to find this {pn}:{current_task} in taskdepdata") +def collect_direct_deps(d, dep_task): deps = set() - for dep_name in this_dep[3]: - dep_data = taskdepdata[dep_name] - if dep_data[1] == dep_task and dep_data[0] != pn: - deps.add((dep_data[0], dep_data[7])) return sorted(deps) @@ -509,9 +564,8 @@ do_collect_spdx_deps[deptask] = "do_create_spdx" do_collect_spdx_deps[dirs] = "${SPDXDIR}" python do_create_spdx() { - from datetime import datetime, timezone import oe.sbom - import oe.spdx + import oe.spdx3 import uuid from pathlib import Path from contextlib import contextmanager @@ -538,36 +592,34 @@ python do_create_spdx() { include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1" archive_sources = 
d.getVar("SPDX_ARCHIVE_SOURCES") == "1" archive_packaged = d.getVar("SPDX_ARCHIVE_PACKAGED") == "1" - pkg_arch = d.getVar("SSTATE_PKGARCH") - - creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") - doc = oe.spdx.SPDXDocument() + doc = oe.spdx3.SPDX3SpdxDocument() doc.name = "recipe-" + d.getVar("PN") doc.documentNamespace = get_doc_namespace(d, doc) - doc.creationInfo.created = creation_time - doc.creationInfo.comment = "This document was created by analyzing recipe files during the build." - doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] - doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") - doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) - doc.creationInfo.creators.append("Person: N/A ()") - - recipe = oe.spdx.SPDXPackage() + generate_creationInfo(d, doc) + + recipe = oe.spdx3.SPDX3Package() + recipe.spdxId = oe.sbom.get_recipe_spdxid(d) recipe.name = d.getVar("PN") - recipe.versionInfo = d.getVar("PV") - recipe.SPDXID = oe.sbom.get_recipe_spdxid(d) - recipe.supplier = d.getVar("SPDX_SUPPLIER") + recipe.packageVersion = d.getVar("PV") + recipe.suppliedBy.append(get_supplier(d, doc)) + if bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d): - recipe.annotations.append(create_annotation(d, "isNative")) + comment = oe.spdx3.SPDX3Annotation() + comment.annotationType = "other" + comment.subject = recipe.spdxId + comment.statement = "isNative" + + doc.element.append(comment) homepage = d.getVar("HOMEPAGE") if homepage: - recipe.homepage = homepage - - license = d.getVar("LICENSE") - if license: - recipe.licenseDeclared = convert_license_to_spdx(license, doc, d) + recipe.homePage = homepage +# TODO: Rework when License Profile implemented +# license = d.getVar("LICENSE") +# if license: +# recipe.licenseDeclared = convert_license_to_spdx(license, doc, d) summary = d.getVar("SUMMARY") if summary: @@ -581,26 +633,11 @@ python 
do_create_spdx() { for var in d.getVar('SPDX_CUSTOM_ANNOTATION_VARS').split(): recipe.annotations.append(create_annotation(d, var + "=" + d.getVar(var))) - # Some CVEs may be patched during the build process without incrementing the version number, - # so querying for CVEs based on the CPE id can lead to false positives. To account for this, - # save the CVEs fixed by patches to source information field in the SPDX. - patched_cves = oe.cve_check.get_patched_cves(d) - patched_cves = list(patched_cves) - patched_cves = ' '.join(patched_cves) - if patched_cves: - recipe.sourceInfo = "CVEs fixed: " + patched_cves - - cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION")) - if cpe_ids: - for cpe_id in cpe_ids: - cpe = oe.spdx.SPDXExternalReference() - cpe.referenceCategory = "SECURITY" - cpe.referenceType = "http://spdx.org/rdf/references/cpe23Type" - cpe.referenceLocator = cpe_id - recipe.externalRefs.append(cpe) - - doc.packages.append(recipe) - doc.add_relationship(doc, "DESCRIBES", recipe) + # TODO: CVE handling + + doc.element.append(recipe) + + doc.add_relationship(doc, "describes", recipe) add_download_packages(d, doc, recipe) @@ -615,7 +652,7 @@ python do_create_spdx() { recipe, spdx_workdir, lambda file_counter: "SPDXRef-SourceFile-%s-%d" % (d.getVar("PN"), file_counter), - lambda filepath: ["SOURCE"], + lambda filepath: ["source"], ignore_dirs=[".git"], ignore_top_level_dirs=["temp"], archive=archive, @@ -626,17 +663,13 @@ python do_create_spdx() { dep_recipes = collect_dep_recipes(d, doc, recipe) - doc_sha1 = oe.sbom.write_doc(d, doc, pkg_arch, "recipes", indent=get_json_indent(d)) + doc_sha1 = oe.sbom.write_doc(d, doc, doc, d.getVar("SSTATE_PKGARCH"), "recipes", indent=get_json_indent(d)) dep_recipes.append(oe.sbom.DepRecipe(doc, doc_sha1, recipe)) - recipe_ref = oe.spdx.SPDXExternalDocumentRef() - recipe_ref.externalDocumentId = "DocumentRef-recipe-" + recipe.name - recipe_ref.spdxDocument = doc.documentNamespace - 
recipe_ref.checksum.algorithm = "SHA1" - recipe_ref.checksum.checksumValue = doc_sha1 + #TODO: references sources = collect_dep_sources(d, dep_recipes) - found_licenses = {license.name:recipe_ref.externalDocumentId + ":" + license.licenseId for license in doc.hasExtractedLicensingInfos} +# found_licenses = {license.name:recipe_ref.externalDocumentId + ":" + license.licenseId for license in doc.hasExtractedLicensingInfos} if not recipe_spdx_is_native(d, recipe): bb.build.exec_func("read_subpackage_metadata", d) @@ -646,42 +679,41 @@ python do_create_spdx() { if not oe.packagedata.packaged(package, d): continue - package_doc = oe.spdx.SPDXDocument() + doc = oe.spdx3.SPDX3SpdxDocument() pkg_name = d.getVar("PKG:%s" % package) or package - package_doc.name = pkg_name - package_doc.documentNamespace = get_doc_namespace(d, package_doc) - package_doc.creationInfo.created = creation_time - package_doc.creationInfo.comment = "This document was created by analyzing packages created during the build." 
- package_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] - package_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") - package_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) - package_doc.creationInfo.creators.append("Person: N/A ()") - package_doc.externalDocumentRefs.append(recipe_ref) + doc.name = pkg_name + doc.documentNamespace = get_doc_namespace(d, doc) + generate_creationInfo(d, doc) + + # TODO: Rework when License Profile implemented + # package_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] + # package_doc.externalDocumentRefs.append(recipe_ref) package_license = d.getVar("LICENSE:%s" % package) or d.getVar("LICENSE") - spdx_package = oe.spdx.SPDXPackage() + spdx_package = oe.spdx3.SPDX3Package() - spdx_package.SPDXID = oe.sbom.get_package_spdxid(pkg_name) + spdx_package.spdxId = oe.sbom.get_package_spdxid(pkg_name) spdx_package.name = pkg_name - spdx_package.versionInfo = d.getVar("PV") - spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses) - spdx_package.supplier = d.getVar("SPDX_SUPPLIER") + spdx_package.packageVersion = d.getVar("PV") + # TODO: Rework when License Profile implemented + #spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses) + spdx_package.suppliedBy = [ d.getVar("SPDX_SUPPLIER") ] - package_doc.packages.append(spdx_package) + doc.element.append(spdx_package) - package_doc.add_relationship(spdx_package, "GENERATED_FROM", "%s:%s" % (recipe_ref.externalDocumentId, recipe.SPDXID)) - package_doc.add_relationship(package_doc, "DESCRIBES", spdx_package) + doc.add_relationship(recipe, "generates", spdx_package) + doc.add_relationship(doc, "describes", spdx_package) - package_archive = deploy_dir_spdx / "packages" / (package_doc.name + ".tar.zst") + package_archive = deploy_dir_spdx / "packages" / 
(doc.name + ".tar.zst") with optional_tarfile(package_archive, archive_packaged) as archive: package_files = add_package_files( d, - package_doc, + doc, spdx_package, pkgdest / package, lambda file_counter: oe.sbom.get_packaged_file_spdxid(pkg_name, file_counter), - lambda filepath: ["BINARY"], + lambda filepath: ["executable"], ignore_top_level_dirs=['CONTROL', 'DEBIAN'], archive=archive, ) @@ -689,9 +721,9 @@ python do_create_spdx() { if archive is not None: spdx_package.packageFileName = str(package_archive.name) - add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources) + add_package_sources_from_debug(d, doc, spdx_package, package, package_files, sources) - oe.sbom.write_doc(d, package_doc, pkg_arch, "packages", indent=get_json_indent(d)) + oe.sbom.write_doc(d, doc, doc, d.getVar("SSTATE_PKGARCH"), "packages", indent=get_json_indent(d)) } do_create_spdx[vardepsexclude] += "BB_NUMBER_THREADS" # NOTE: depending on do_unpack is a hack that is necessary to get it's dependencies for archive the source @@ -749,127 +781,11 @@ def collect_package_providers(d): collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA" python do_create_runtime_spdx() { - from datetime import datetime, timezone - import oe.sbom - import oe.spdx - import oe.packagedata - from pathlib import Path - - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - spdx_deploy = Path(d.getVar("SPDXRUNTIMEDEPLOY")) - is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d) - - creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") - - providers = collect_package_providers(d) - pkg_arch = d.getVar("SSTATE_PKGARCH") - package_archs = d.getVar("SSTATE_ARCHS").split() - package_archs.reverse() - - if not is_native: - bb.build.exec_func("read_subpackage_metadata", d) - - dep_package_cache = {} - - pkgdest = Path(d.getVar("PKGDEST")) - for package in d.getVar("PACKAGES").split(): - localdata = bb.data.createCopy(d) - 
pkg_name = d.getVar("PKG:%s" % package) or package - localdata.setVar("PKG", pkg_name) - localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package) - - if not oe.packagedata.packaged(package, localdata): - continue - - pkg_spdx_path = oe.sbom.doc_path(deploy_dir_spdx, pkg_name, pkg_arch, "packages") - - package_doc, package_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path) - - for p in package_doc.packages: - if p.name == pkg_name: - spdx_package = p - break - else: - bb.fatal("Package '%s' not found in %s" % (pkg_name, pkg_spdx_path)) - - runtime_doc = oe.spdx.SPDXDocument() - runtime_doc.name = "runtime-" + pkg_name - runtime_doc.documentNamespace = get_doc_namespace(localdata, runtime_doc) - runtime_doc.creationInfo.created = creation_time - runtime_doc.creationInfo.comment = "This document was created by analyzing package runtime dependencies." - runtime_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] - runtime_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") - runtime_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) - runtime_doc.creationInfo.creators.append("Person: N/A ()") - - package_ref = oe.spdx.SPDXExternalDocumentRef() - package_ref.externalDocumentId = "DocumentRef-package-" + package - package_ref.spdxDocument = package_doc.documentNamespace - package_ref.checksum.algorithm = "SHA1" - package_ref.checksum.checksumValue = package_doc_sha1 - - runtime_doc.externalDocumentRefs.append(package_ref) - - runtime_doc.add_relationship( - runtime_doc.SPDXID, - "AMENDS", - "%s:%s" % (package_ref.externalDocumentId, package_doc.SPDXID) - ) - - deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "") - seen_deps = set() - for dep, _ in deps.items(): - if dep in seen_deps: - continue - - if dep not in providers: - continue - - (dep, dep_hashfn) = providers[dep] - - if not oe.packagedata.packaged(dep, localdata): - continue - - dep_pkg_data 
= oe.packagedata.read_subpkgdata_dict(dep, d) - dep_pkg = dep_pkg_data["PKG"] - - if dep in dep_package_cache: - (dep_spdx_package, dep_package_ref) = dep_package_cache[dep] - else: - dep_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, dep_pkg, dep_hashfn) - if not dep_path: - bb.fatal("No SPDX file found for package %s, %s" % (dep_pkg, dep_hashfn)) - - spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_path) - - for pkg in spdx_dep_doc.packages: - if pkg.name == dep_pkg: - dep_spdx_package = pkg - break - else: - bb.fatal("Package '%s' not found in %s" % (dep_pkg, dep_path)) - - dep_package_ref = oe.spdx.SPDXExternalDocumentRef() - dep_package_ref.externalDocumentId = "DocumentRef-runtime-dependency-" + spdx_dep_doc.name - dep_package_ref.spdxDocument = spdx_dep_doc.documentNamespace - dep_package_ref.checksum.algorithm = "SHA1" - dep_package_ref.checksum.checksumValue = spdx_dep_sha1 - - dep_package_cache[dep] = (dep_spdx_package, dep_package_ref) - - runtime_doc.externalDocumentRefs.append(dep_package_ref) - - runtime_doc.add_relationship( - "%s:%s" % (dep_package_ref.externalDocumentId, dep_spdx_package.SPDXID), - "RUNTIME_DEPENDENCY_OF", - "%s:%s" % (package_ref.externalDocumentId, spdx_package.SPDXID) - ) - seen_deps.add(dep) - - oe.sbom.write_doc(d, runtime_doc, pkg_arch, "runtime", spdx_deploy, indent=get_json_indent(d)) + # TODO: implement for SPDX3 + return } -do_create_runtime_spdx[vardepsexclude] += "OVERRIDES SSTATE_ARCHS" +do_create_runtime_spdx[vardepsexclude] += "OVERRIDES" addtask do_create_runtime_spdx after do_create_spdx before do_build do_rm_work SSTATETASKS += "do_create_runtime_spdx" @@ -950,209 +866,13 @@ POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk = " sdk_host_combine_spd POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk = " sdk_target_combine_spdx" python image_combine_spdx() { - import os - import oe.sbom - from pathlib import Path - from oe.rootfs import image_list_installed_packages - - image_name = 
d.getVar("IMAGE_NAME") - image_link_name = d.getVar("IMAGE_LINK_NAME") - imgdeploydir = Path(d.getVar("IMGDEPLOYDIR")) - img_spdxid = oe.sbom.get_image_spdxid(image_name) - packages = image_list_installed_packages(d) - - combine_spdx(d, image_name, imgdeploydir, img_spdxid, packages, Path(d.getVar("SPDXIMAGEWORK"))) - - def make_image_link(target_path, suffix): - if image_link_name: - link = imgdeploydir / (image_link_name + suffix) - if link != target_path: - link.symlink_to(os.path.relpath(target_path, link.parent)) - - spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst") - make_image_link(spdx_tar_path, ".spdx.tar.zst") + return } python sdk_host_combine_spdx() { - sdk_combine_spdx(d, "host") + return } python sdk_target_combine_spdx() { - sdk_combine_spdx(d, "target") + return } - -def sdk_combine_spdx(d, sdk_type): - import oe.sbom - from pathlib import Path - from oe.sdk import sdk_list_installed_packages - - sdk_name = d.getVar("TOOLCHAIN_OUTPUTNAME") + "-" + sdk_type - sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR")) - sdk_spdxid = oe.sbom.get_sdk_spdxid(sdk_name) - sdk_packages = sdk_list_installed_packages(d, sdk_type == "target") - combine_spdx(d, sdk_name, sdk_deploydir, sdk_spdxid, sdk_packages, Path(d.getVar('SPDXSDKWORK'))) - -def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages, spdx_workdir): - import os - import oe.spdx - import oe.sbom - import io - import json - from datetime import timezone, datetime - from pathlib import Path - import tarfile - import bb.compress.zstd - - providers = collect_package_providers(d) - package_archs = d.getVar("SSTATE_ARCHS").split() - package_archs.reverse() - - creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") - - doc = oe.spdx.SPDXDocument() - doc.name = rootfs_name - doc.documentNamespace = get_doc_namespace(d, doc) - doc.creationInfo.created = 
creation_time - doc.creationInfo.comment = "This document was created by analyzing the source of the Yocto recipe during the build." - doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"] - doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass") - doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG")) - doc.creationInfo.creators.append("Person: N/A ()") - - image = oe.spdx.SPDXPackage() - image.name = d.getVar("PN") - image.versionInfo = d.getVar("PV") - image.SPDXID = rootfs_spdxid - image.supplier = d.getVar("SPDX_SUPPLIER") - - doc.packages.append(image) - - for name in sorted(packages.keys()): - if name not in providers: - bb.fatal("Unable to find SPDX provider for '%s'" % name) - - pkg_name, pkg_hashfn = providers[name] - - pkg_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, pkg_name, pkg_hashfn) - if not pkg_spdx_path: - bb.fatal("No SPDX file found for package %s, %s" % (pkg_name, pkg_hashfn)) - - pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path) - - for p in pkg_doc.packages: - if p.name == name: - pkg_ref = oe.spdx.SPDXExternalDocumentRef() - pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name - pkg_ref.spdxDocument = pkg_doc.documentNamespace - pkg_ref.checksum.algorithm = "SHA1" - pkg_ref.checksum.checksumValue = pkg_doc_sha1 - - doc.externalDocumentRefs.append(pkg_ref) - doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID)) - break - else: - bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path)) - - runtime_spdx_path = oe.sbom.doc_find_by_hashfn(deploy_dir_spdx, package_archs, "runtime-" + name, pkg_hashfn) - if not runtime_spdx_path: - bb.fatal("No runtime SPDX document found for %s, %s" % (name, pkg_hashfn)) - - runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path) - - runtime_ref = oe.spdx.SPDXExternalDocumentRef() - runtime_ref.externalDocumentId = 
"DocumentRef-%s" % runtime_doc.name - runtime_ref.spdxDocument = runtime_doc.documentNamespace - runtime_ref.checksum.algorithm = "SHA1" - runtime_ref.checksum.checksumValue = runtime_doc_sha1 - - # "OTHER" isn't ideal here, but I can't find a relationship that makes sense - doc.externalDocumentRefs.append(runtime_ref) - doc.add_relationship( - image, - "OTHER", - "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID), - comment="Runtime dependencies for %s" % name - ) - - image_spdx_path = spdx_workdir / (rootfs_name + ".spdx.json") - - with image_spdx_path.open("wb") as f: - doc.to_json(f, sort_keys=True, indent=get_json_indent(d)) - - num_threads = int(d.getVar("BB_NUMBER_THREADS")) - - visited_docs = set() - - index = {"documents": []} - - spdx_tar_path = rootfs_deploydir / (rootfs_name + ".spdx.tar.zst") - with bb.compress.zstd.open(spdx_tar_path, "w", num_threads=num_threads) as f: - with tarfile.open(fileobj=f, mode="w|") as tar: - def collect_spdx_document(path): - nonlocal tar - nonlocal deploy_dir_spdx - nonlocal source_date_epoch - nonlocal index - - if path in visited_docs: - return - - visited_docs.add(path) - - with path.open("rb") as f: - doc, sha1 = oe.sbom.read_doc(f) - f.seek(0) - - if doc.documentNamespace in visited_docs: - return - - bb.note("Adding SPDX document %s" % path) - visited_docs.add(doc.documentNamespace) - info = tar.gettarinfo(fileobj=f) - - info.name = doc.name + ".spdx.json" - info.uid = 0 - info.gid = 0 - info.uname = "root" - info.gname = "root" - - if source_date_epoch is not None and info.mtime > int(source_date_epoch): - info.mtime = int(source_date_epoch) - - tar.addfile(info, f) - - index["documents"].append({ - "filename": info.name, - "documentNamespace": doc.documentNamespace, - "sha1": sha1, - }) - - for ref in doc.externalDocumentRefs: - ref_path = oe.sbom.doc_find_by_namespace(deploy_dir_spdx, package_archs, ref.spdxDocument) - if not ref_path: - bb.fatal("Cannot find any SPDX file for document %s" % 
ref.spdxDocument) - collect_spdx_document(ref_path) - - collect_spdx_document(image_spdx_path) - - index["documents"].sort(key=lambda x: x["filename"]) - - index_str = io.BytesIO(json.dumps( - index, - sort_keys=True, - indent=get_json_indent(d), - ).encode("utf-8")) - - info = tarfile.TarInfo() - info.name = "index.json" - info.size = len(index_str.getvalue()) - info.uid = 0 - info.gid = 0 - info.uname = "root" - info.gname = "root" - - tar.addfile(info, fileobj=index_str) - -combine_spdx[vardepsexclude] += "BB_NUMBER_THREADS SSTATE_ARCHS"
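---

A note for reviewers on the data-model change in collect_dep_sources(): SPDX 2 gave us a per-document files list and a per-recipe hasFiles set, whereas in SPDX 3 everything lives in one flat "element" list and file ownership is expressed by a "contains" Relationship from the recipe. The sketch below is illustrative only, not the oe.spdx3 module itself: the stand-in classes mirror the attribute names used in this patch (SPDX3Relationship._from, SPDX3File.verifiedUsing, and so on), while the real definitions come from meta/lib/oe/spdx3.py introduced elsewhere in this series.

```python
# Minimal stand-ins for the oe.spdx3 classes used in this patch.
# These are assumptions for illustration; the real classes live in
# meta/lib/oe/spdx3.py and carry many more properties.

class SPDX3Hash:
    def __init__(self, algorithm, hashValue):
        self.algorithm = algorithm      # SPDX 3 uses lowercase, e.g. "sha256"
        self.hashValue = hashValue

class SPDX3File:
    def __init__(self, spdxId, primaryPurpose, verifiedUsing):
        self.spdxId = spdxId
        self.primaryPurpose = primaryPurpose
        self.additionalPurpose = []
        self.verifiedUsing = verifiedUsing

class SPDX3Relationship:
    def __init__(self, _from, relationshipType, to):
        self._from = _from              # "from" is a Python keyword
        self.relationshipType = relationshipType
        self.to = to                    # list of element spdxIds

def collect_source_hashes(elements, recipe_spdxid):
    """Map sha256 hash -> file spdxId for the source files that the
    recipe element 'contains', mirroring the collect_dep_sources() walk."""
    # First pass: find the recipe's "contains" relationship.
    recipe_files = []
    for element in elements:
        if isinstance(element, SPDX3Relationship) \
           and element._from == recipe_spdxid \
           and element.relationshipType == "contains":
            recipe_files = element.to

    # Second pass: pick contained files with a source purpose and a sha256.
    sources = {}
    for element in elements:
        if isinstance(element, SPDX3File) \
           and element.spdxId in recipe_files \
           and (element.primaryPurpose == "source"
                or "source" in element.additionalPurpose):
            for checksum in element.verifiedUsing:
                if checksum.algorithm == "sha256":
                    sources[checksum.hashValue] = element.spdxId
                    break
    return sources
```

The two-pass shape matters: because the element list is flat, the "contains" relationship must be resolved before files can be filtered, whereas the SPDX 2 code could test membership in dep.recipe.hasFiles directly.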
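Similarly, the "isNative" marker changes shape: SPDX 2 stored it as an annotation on the package itself, while this patch emits a separate SPDX3Annotation element whose subject is the recipe's spdxId. A consumer then has to scan the element list for it. A hedged sketch of that lookup (again with a hypothetical stand-in class, not the real oe.spdx3 API):

```python
# Stand-in for oe.spdx3.SPDX3Annotation, assumed for illustration only.
class SPDX3Annotation:
    def __init__(self, subject, statement, annotationType="other"):
        self.annotationType = annotationType
        self.subject = subject      # spdxId of the annotated element
        self.statement = statement

def is_native(elements, recipe_spdxid):
    """Return True if an "isNative" annotation targets the given recipe."""
    return any(isinstance(e, SPDX3Annotation)
               and e.subject == recipe_spdxid
               and e.statement == "isNative"
               for e in elements)
```

Expressing the scan as a single any() also makes it easy to skip the whole dependency at the outer-loop level, which is what the native-recipe filter ultimately needs to do.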