
Re: [PATCH v2 3/3] OvmfPkg/X86QemuLoadImageLib: State fw_cfg dependency in file header

Laszlo Ersek
 

On 06/17/21 14:17, Dov Murik wrote:
Make it clear that X86QemuLoadImageLib relies on fw_cfg; prepare the
ground to add a warning about the incompatibility with boot verification
process.

Cc: Laszlo Ersek <lersek@redhat.com>
Cc: Ard Biesheuvel <ardb+tianocore@kernel.org>
Cc: Jordan Justen <jordan.l.justen@intel.com>
Cc: James Bottomley <jejb@linux.ibm.com>
Cc: Tobin Feldman-Fitzthum <tobin@linux.ibm.com>
Signed-off-by: Dov Murik <dovmurik@linux.ibm.com>
---
OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.inf | 3 +++
OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.c | 3 +++
2 files changed, 6 insertions(+)
(1) The bugzilla ticket should be referenced in the commit message,
above your signoff:

Ref: https://bugzilla.tianocore.org/show_bug.cgi?id=3457

With that update:

Reviewed-by: Laszlo Ersek <lersek@redhat.com>

Thanks,
Laszlo


diff --git a/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.inf b/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.inf
index e1615badd2ba..c7ec041cb706 100644
--- a/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.inf
+++ b/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.inf
@@ -2,6 +2,9 @@
# X86 specific implementation of QemuLoadImageLib library class interface
# with support for loading mixed mode images and non-EFI stub images
#
+# Note that this implementation reads the cmdline (and possibly kernel, setup
+# data, and initrd in the legacy boot mode) from fw_cfg directly.
+#
# Copyright (c) 2020, ARM Ltd. All rights reserved.<BR>
#
# SPDX-License-Identifier: BSD-2-Clause-Patent
diff --git a/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.c b/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.c
index 1177582ab051..dc9018f4333b 100644
--- a/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.c
+++ b/OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.c
@@ -2,6 +2,9 @@
X86 specific implementation of QemuLoadImageLib library class interface
with support for loading mixed mode images and non-EFI stub images

+ Note that this implementation reads the cmdline (and possibly kernel, setup
+ data, and initrd in the legacy boot mode) from fw_cfg directly.
+
Copyright (c) 2006 - 2015, Intel Corporation. All rights reserved.<BR>
Copyright (c) 2020, ARM Ltd. All rights reserved.<BR>


Re: [PATCH v2 2/3] OvmfPkg/GenericQemuLoadImageLib: Read cmdline from QemuKernelLoaderFs

Laszlo Ersek
 

Hi Dov,

On 06/17/21 14:16, Dov Murik wrote:
Remove the QemuFwCfgLib interface used to read the QEMU cmdline
(-append argument) and the initrd size. Instead, use the synthetic
filesystem QemuKernelLoaderFs which has three files: "kernel", "initrd",
and "cmdline".

Cc: Laszlo Ersek <lersek@redhat.com>
Cc: Ard Biesheuvel <ardb+tianocore@kernel.org>
Cc: Jordan Justen <jordan.l.justen@intel.com>
Cc: James Bottomley <jejb@linux.ibm.com>
Cc: Tobin Feldman-Fitzthum <tobin@linux.ibm.com>
Signed-off-by: Dov Murik <dovmurik@linux.ibm.com>
---
OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.inf | 2 +-
OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.c | 145 ++++++++++++++++++--
2 files changed, 133 insertions(+), 14 deletions(-)
This update seems to address everything that Ard requested under v1;
thanks.

My comments:

(1) I spent a lot of time reviewing your patch. Unfortunately, I found a
preexistent bug in both QemuLoadImageLib instances, which we should fix
first, in two separate patches.

The bug was introduced in commit ddd2be6b0026 ("OvmfPkg: provide a
generic implementation of QemuLoadImageLib", 2020-03-05). Unfortunately
I missed the bug in my original review.

In said commit, the QemuLoadKernelImage() function
[OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.c]
refactored / reimplemented the logic from the TryRunningQemuKernel()
function [ArmVirtPkg/Library/PlatformBootManagerLib/QemuKernel.c].

If we now check out the tree at ddd2be6b0026, and compare the above two
functions, we notice the following:

(1a) TryRunningQemuKernel() downloads all three blobs via fw_cfg in the
beginning, and *always* frees all successfully downloaded blobs at the
end, under the "FreeBlobs" label.

(1b) In QemuLoadKernelImage(), the kernel and initrd fw_cfg blobs are
owned by the QemuKernelLoaderFsDxe driver; only the command line blob is
downloaded from fw_cfg. Not freeing the former two blobs (kernel and
initrd) makes sense. *However*, the command line blob should *still* be
freed, even if QemuLoadKernelImage() succeeds! That's because we have no
use for the command line fw_cfg blob, after it is translated to
LoadOptions.

The bug is that QemuLoadKernelImage() leaks "CommandLine" on success.


The same issue was introduced in the other lib instance
[OvmfPkg/Library/X86QemuLoadImageLib/X86QemuLoadImageLib.c], in commit
7c47d89003a6 ("OvmfPkg: implement QEMU loader library for X86 with
legacy fallback", 2020-03-05).


The fix is identical between both library instances:

@@ -193,14 +193,16 @@ QemuLoadKernelImage (
}

*ImageHandle = KernelImageHandle;
- return EFI_SUCCESS;
+ Status = EFI_SUCCESS;

FreeCommandLine:
if (CommandLineSize > 0) {
FreePool (CommandLine);
}
UnloadImage:
- gBS->UnloadImage (KernelImageHandle);
+ if (EFI_ERROR (Status)) {
+ gBS->UnloadImage (KernelImageHandle);
+ }

return Status;
}
Can you please submit this fix twice, in two separate patches at the
*very front* of this series, one patch for each lib instance? Something
like:

#1 OvmfPkg/GenericQemuLoadImageLib: plug cmdline blob leak on success
...
Reported-by: Laszlo Ersek <lersek@redhat.com>
Fixes: ddd2be6b0026abcd0f819b3915fc80c3de81dd62

#2 OvmfPkg/X86QemuLoadImageLib: plug cmdline blob leak on success
...
Reported-by: Laszlo Ersek <lersek@redhat.com>
Fixes: 7c47d89003a6f8f7f6f0ce8ca7d3e87c630d14cc

Thank you in advance!

Then, comments on your actual patch:

(2) The bugzilla ticket should be referenced in the commit message
please, above your signoff:

Ref: https://bugzilla.tianocore.org/show_bug.cgi?id=3457


On 06/17/21 14:16, Dov Murik wrote:

diff --git a/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.inf b/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.inf
index b262cb926a4d..f462fd6922cf 100644
--- a/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.inf
+++ b/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.inf
@@ -27,12 +27,12 @@ [LibraryClasses]
DebugLib
MemoryAllocationLib
PrintLib
- QemuFwCfgLib
UefiBootServicesTableLib

[Protocols]
gEfiDevicePathProtocolGuid
gEfiLoadedImageProtocolGuid
+ gEfiSimpleFileSystemProtocolGuid

[Guids]
gQemuKernelLoaderFsMediaGuid
(3) The FileHandleLib class should be added, under [LibraryClasses].


diff --git a/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.c b/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.c
index 114db7e8441f..f520456e3b24 100644
--- a/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.c
+++ b/OvmfPkg/Library/GenericQemuLoadImageLib/GenericQemuLoadImageLib.c
@@ -11,9 +11,9 @@
#include <Base.h>
#include <Guid/QemuKernelLoaderFsMedia.h>
#include <Library/DebugLib.h>
+#include <Library/FileHandleLib.h>
#include <Library/MemoryAllocationLib.h>
#include <Library/PrintLib.h>
-#include <Library/QemuFwCfgLib.h>
#include <Library/QemuLoadImageLib.h>
#include <Library/UefiBootServicesTableLib.h>
#include <Protocol/DevicePath.h>
(4) The new "gEfiSimpleFileSystemProtocolGuid" dependency should be
reflected here too, by adding:

#include <Protocol/SimpleFileSystem.h>

(In general the [Protocols] section of the INF file should be matched by
#include <Protocol/...> directives.)

This was masked from you because <Library/FileHandleLib.h> pulled in
<Protocol/SimpleFileSystem.h>, but that's not enough justification for a
difference between the INF [Protocols] section and the #include
directive list.


@@ -30,6 +30,11 @@ typedef struct {
KERNEL_FILE_DEVPATH FileNode;
EFI_DEVICE_PATH_PROTOCOL EndNode;
} KERNEL_VENMEDIA_FILE_DEVPATH;
+
+typedef struct {
+ VENDOR_DEVICE_PATH VenMediaNode;
+ EFI_DEVICE_PATH_PROTOCOL EndNode;
+} SINGLE_VENMEDIA_NODE_DEVPATH;
#pragma pack ()

STATIC CONST KERNEL_VENMEDIA_FILE_DEVPATH mKernelDevicePath = {
@@ -51,6 +56,78 @@ STATIC CONST KERNEL_VENMEDIA_FILE_DEVPATH mKernelDevicePath = {
}
};

+STATIC CONST SINGLE_VENMEDIA_NODE_DEVPATH mQemuKernelLoaderFileSystemDevicePath = {
(5) This variable name causes two overlong lines in the file; it should
be renamed to "mQemuKernelLoaderFsDevicePath" please.


+ {
+ {
+ MEDIA_DEVICE_PATH, MEDIA_VENDOR_DP,
+ { sizeof (VENDOR_DEVICE_PATH) }
+ },
+ QEMU_KERNEL_LOADER_FS_MEDIA_GUID
+ }, {
+ END_DEVICE_PATH_TYPE, END_ENTIRE_DEVICE_PATH_SUBTYPE,
+ { sizeof (EFI_DEVICE_PATH_PROTOCOL) }
+ }
+};
+
+STATIC
+EFI_STATUS
+GetQemuKernelLoaderBlobSize (
+ IN EFI_FILE_HANDLE Root,
+ IN CHAR16 *FileName,
+ OUT UINTN *Size
+ )
+{
+ EFI_STATUS Status;
+ EFI_FILE_HANDLE FileHandle;
+ UINT64 FileSize;
+
+ Status = Root->Open (Root, &FileHandle, FileName, EFI_FILE_MODE_READ, 0);
+ if (EFI_ERROR (Status)) {
+ return Status;
+ }
+ Status = FileHandleGetSize (FileHandle, &FileSize);
+ if (EFI_ERROR (Status)) {
+ goto CloseFile;
+ }
+ *Size = FileSize;
(6) Silent truncation from UINT64 to UINTN, even if theoretical, is bad
practice. Please do this:

if (FileSize > MAX_UINTN) {
Status = EFI_UNSUPPORTED;
goto CloseFile;
}
*Size = (UINTN)FileSize;


+ Status = EFI_SUCCESS;
+CloseFile:
+ FileHandle->Close (FileHandle);
+ return Status;
+}
+
+STATIC
+EFI_STATUS
+ReadWholeQemuKernelLoaderBlob (
+ IN EFI_FILE_HANDLE Root,
+ IN CHAR16 *FileName,
+ IN UINTN Size,
+ OUT VOID *Buffer
+ )
+{
+ EFI_STATUS Status;
+ EFI_FILE_HANDLE FileHandle;
+ UINTN ReadSize;
+
+ Status = Root->Open (Root, &FileHandle, FileName, EFI_FILE_MODE_READ, 0);
+ if (EFI_ERROR (Status)) {
+ return Status;
+ }
+ ReadSize = Size;
+ Status = FileHandle->Read (FileHandle, &ReadSize, Buffer);
+ if (EFI_ERROR (Status)) {
+ goto CloseFile;
+ }
+ if (ReadSize != Size) {
+ Status = EFI_PROTOCOL_ERROR;
+ goto CloseFile;
+ }
+ Status = EFI_SUCCESS;
+CloseFile:
+ FileHandle->Close (FileHandle);
+ return Status;
+}
+
/**
Download the kernel, the initial ramdisk, and the kernel command line from
QEMU's fw_cfg. The kernel will be instructed via its command line to load
@@ -76,12 +153,16 @@ QemuLoadKernelImage (
OUT EFI_HANDLE *ImageHandle
)
{
- EFI_STATUS Status;
- EFI_HANDLE KernelImageHandle;
- EFI_LOADED_IMAGE_PROTOCOL *KernelLoadedImage;
- UINTN CommandLineSize;
- CHAR8 *CommandLine;
- UINTN InitrdSize;
+ EFI_STATUS Status;
+ EFI_HANDLE KernelImageHandle;
+ EFI_LOADED_IMAGE_PROTOCOL *KernelLoadedImage;
+ EFI_DEVICE_PATH_PROTOCOL *DevicePathNode;
+ EFI_HANDLE FsVolumeHandle;
+ EFI_SIMPLE_FILE_SYSTEM_PROTOCOL *FsProtocol;
+ EFI_FILE_HANDLE Root;
+ UINTN CommandLineSize;
+ CHAR8 *CommandLine;
+ UINTN InitrdSize;

//
// Load the image. This should call back into the QEMU EFI loader file system.
@@ -124,8 +205,38 @@ QemuLoadKernelImage (
);
ASSERT_EFI_ERROR (Status);

- QemuFwCfgSelectItem (QemuFwCfgItemCommandLineSize);
- CommandLineSize = (UINTN)QemuFwCfgRead32 ();
+ //
+ // Open the Qemu Kernel Loader abstract filesystem (volume) which will be
+ // used to read the "initrd" and "cmdline" synthetic files.
+ //
(7) This comment is welcome, but it is inexact.

We'll use the filesystem for reading the command line, yes, but
regarding the initrd, we use the filesystem only for learning the *size*
of the initrd. (And even the size of the initrd is only interesting
inasmuch as a nonzero size means that an initrd is *present*.) The initrd
blob itself is not read by us.

I suggest:

used to query the "initrd" and to read the "cmdline" synthetic files.


+ DevicePathNode = (EFI_DEVICE_PATH_PROTOCOL *)&mQemuKernelLoaderFileSystemDevicePath;
+ Status = gBS->LocateDevicePath (
+ &gEfiSimpleFileSystemProtocolGuid,
+ &DevicePathNode,
+ &FsVolumeHandle
+ );
+ if (EFI_ERROR (Status)) {
+ return Status;
(8) This leaks "KernelImageHandle". At this point, gBS->LoadImage() at
the top of the function will have succeeded.

Please jump to the UnloadImage label, rather than returning.


+ }
+
+ Status = gBS->HandleProtocol (
+ FsVolumeHandle,
+ &gEfiSimpleFileSystemProtocolGuid,
+ (VOID **)&FsProtocol
+ );
+ if (EFI_ERROR (Status)) {
+ return Status;
(9) Same leak as described in (8); please jump to the UnloadImage label.


+ }
+
+ Status = FsProtocol->OpenVolume (FsVolumeHandle, &Root);
+ if (EFI_ERROR (Status)) {
+ return Status;
(10) Same leak as described in (8); please jump to the UnloadImage
label.


+ }
+
+ Status = GetQemuKernelLoaderBlobSize (Root, L"cmdline", &CommandLineSize);
+ if (EFI_ERROR (Status)) {
+ goto CloseRoot;
+ }

if (CommandLineSize == 0) {
KernelLoadedImage->LoadOptionsSize = 0;
@@ -136,8 +247,11 @@ QemuLoadKernelImage (
goto UnloadImage;
}
(11) Not fully shown in the context, but here we have:

if (CommandLineSize == 0) {
KernelLoadedImage->LoadOptionsSize = 0;
} else {
CommandLine = AllocatePool (CommandLineSize);
if (CommandLine == NULL) {
Status = EFI_OUT_OF_RESOURCES;
goto UnloadImage;
}

Note that we have a "goto UnloadImage" in it.

Please update that to "goto CloseRoot".



- QemuFwCfgSelectItem (QemuFwCfgItemCommandLineData);
- QemuFwCfgReadBytes (CommandLineSize, CommandLine);
+ Status = ReadWholeQemuKernelLoaderBlob (Root, L"cmdline", CommandLineSize,
+ CommandLine);
+ if (EFI_ERROR (Status)) {
+ goto FreeCommandLine;
+ }

//
// Verify NUL-termination of the command line.
@@ -155,8 +269,10 @@ QemuLoadKernelImage (
KernelLoadedImage->LoadOptionsSize = (UINT32)((CommandLineSize - 1) * 2);
}

- QemuFwCfgSelectItem (QemuFwCfgItemInitrdSize);
- InitrdSize = (UINTN)QemuFwCfgRead32 ();
+ Status = GetQemuKernelLoaderBlobSize (Root, L"initrd", &InitrdSize);
+ if (EFI_ERROR (Status)) {
+ goto FreeCommandLine;
+ }

if (InitrdSize > 0) {
//
@@ -193,6 +309,7 @@ QemuLoadKernelImage (
}

*ImageHandle = KernelImageHandle;
+ Root->Close (Root);
return EFI_SUCCESS;

FreeCommandLine:
@@ -201,6 +318,8 @@ FreeCommandLine:
}
UnloadImage:
gBS->UnloadImage (KernelImageHandle);
+CloseRoot:
+ Root->Close (Root);

return Status;
}
(12) So, the order of handlers is incorrect here, and when I looked into
it, that was when I actually found the preexistent issue (1).

The desired epilogue for the function is:

*ImageHandle = KernelImageHandle;
Status = EFI_SUCCESS;

FreeCommandLine:
if (CommandLineSize > 0) {
FreePool (CommandLine);
}
CloseRoot:
Root->Close (Root);
UnloadImage:
if (EFI_ERROR (Status)) {
gBS->UnloadImage (KernelImageHandle);
}

return Status;
The idea is that CommandLine and Root are both temporaries, and as such
they need to be released on either success or failure. Whereas
KernelImageHandle must be released precisely on failure. Furthermore, in
either case, they must cascade as shown above -- in reverse order of
construction.

Thanks!
Laszlo


Re: [PATCH v4 4/4] OvmfPkg/PlatformDxe: Add support for SEV live migration.

Ashish Kalra
 

Hello Tom,

On Tue, Jun 22, 2021 at 06:06:24PM -0500, Tom Lendacky wrote:
+
+/**
+ Figures out if we are running inside KVM HVM and
+ KVM HVM supports SEV Live Migration feature.
+
+ @retval TRUE KVM was detected and Live Migration supported
+ @retval FALSE KVM was not detected or Live Migration not supported
+
+**/
+BOOLEAN
+KvmDetectSevLiveMigrationFeature(
+ VOID
+ )
+{
+ UINT8 Signature[13];
+ UINT32 mKvmLeaf = 0;
+ UINT32 RegEax, RegEbx, RegEcx, RegEdx;
+
+ Signature[12] = '\0';
+ for (mKvmLeaf = 0x40000000; mKvmLeaf < 0x40010000; mKvmLeaf += 0x100) {
What's the reason for the loop? I would think that just checking
0x40000000 would be enough, so a comment seems to be warranted.
The 0x40000000 leaf is the hypervisor CPUID information leaf, so probably
just checking 0x40000000 should be enough.

But I see that other hypervisor detection functions, like XenDetect(),
loop over the hypervisor existence leaves until the signature matches;
is there a specific reason for that?

Is this meant to support another hypervisor, or multiple hypervisors?

+ AsmCpuid (mKvmLeaf,
+ NULL,
+ (UINT32 *) &Signature[0],
+ (UINT32 *) &Signature[4],
+ (UINT32 *) &Signature[8]);
+
+ if (!AsciiStrCmp ((CHAR8 *) Signature, "KVMKVMKVM\0\0\0")) {
+ DEBUG ((
+ DEBUG_ERROR,
DEBUG_INFO, it doesn't seem like an error.
Ok.

+ "%a: KVM Detected, signature = %s\n",
+ __FUNCTION__,
+ Signature
+ ));
+ RegEax = 0x40000001;
Should this be mKvmLeaf + 1? It is confusing that you may check 0x40000100
and then not do 0x40000101.
Yes, it should be mKvmLeaf + 1, assuming the loop above is being used.

+ RegEcx = 0;
+ AsmCpuid (0x40000001, &RegEax, &RegEbx, &RegEcx, &RegEdx);
+ if (RegEax & (1 << KVM_FEATURE_MIGRATION_CONTROL)) {
Thanks,
Ashish


Re: [PATCH v1 0/5] EDK2 Code First: PI Specification: Update EFI_MM_COMMUNICATE_HEADER

Michael D Kinney
 

Hello,

Flexible array members are supported and should be used. The old style
of adding an array of size [1] at the end of a structure was used at a
time when flexible array members were not supported by all compilers
(late 1990s). The workarounds used to handle the array of size [1] are
very confusing when reading the C code, and the fact that sizeof() does
not produce the expected result makes it even worse.

If we use flexible array members in this proposed change then there is
no need to use OFFSET_OF(). Correct?

Mike

-----Original Message-----
From: Marvin Häuser <mhaeuser@posteo.de>
Sent: Thursday, June 24, 2021 1:00 AM
To: Kun Qin <kuqin12@gmail.com>; Laszlo Ersek <lersek@redhat.com>; devel@edk2.groups.io
Cc: Wang, Jian J <jian.j.wang@intel.com>; Wu, Hao A <hao.a.wu@intel.com>; Dong, Eric <eric.dong@intel.com>; Ni, Ray
<ray.ni@intel.com>; Kinney, Michael D <michael.d.kinney@intel.com>; Liming Gao <gaoliming@byosoft.com.cn>; Liu, Zhiguang
<zhiguang.liu@intel.com>; Andrew Fish <afish@apple.com>; Leif Lindholm <leif@nuviainc.com>; Bret Barkelew
<Bret.Barkelew@microsoft.com>; michael.kubacki@microsoft.com
Subject: Re: [edk2-devel] [PATCH v1 0/5] EDK2 Code First: PI Specification: Update EFI_MM_COMMUNICATE_HEADER

Hey Kun,

Why would you rely on undefined behaviours? The OFFSET_OF macro is
well-defined for GCC and Clang as it's implemented by an intrinsic, and
while the expression for the MSVC compiler is undefined behaviour as per
the C standard, it is well-defined for MSVC due to their own
implementation being identical. From my standpoint, all supported
compilers will yield well-defined behaviour even this way. OFFSET_OF on
flexible arrays is not UB in any case to my knowledge.

However, the same way as your new suggestion, you can replace OFFSET_OF
with sizeof. While this *can* lead to wasted space with certain
structure layouts (e.g. when the flexible array overlays padding bytes),
this is not the case here, and otherwise just loses you a few bytes. I
think this comes down to preference.

The pattern you mentioned arguably is less nice syntax when used
(involves address calculation and casting), but the biggest problem here
is alignment constraints. For packed structures, you lose the ability of
automatic unaligned accesses (irrelevant here because the structure is
manually padded anyway). For non-packed structures, you still need to
ensure the alignment requirement of the trailing array data is met
manually. With flexible array members, the compiler takes care of both
cases automatically.

Best regards,
Marvin

On 24.06.21 02:24, Kun Qin wrote:
Hi Marvin,

I would prefer not to rely on undefined behaviors from different
compilers. Instead of using flexible arrays, is it better to remove
the `Data` field, pack the structure and follow
"VARIABLE_LOCK_ON_VAR_STATE_POLICY" pattern?

In that case, OFFSET_OF will be forced to change to sizeof, and
read/write to `Data` will follow the range indicated by MessageLength.
But yes, that will force developers to update their platform-level
implementations accordingly.

Regards,
Kun

On 06/23/2021 08:26, Laszlo Ersek wrote:
On 06/23/21 08:54, Marvin Häuser wrote:
On 22.06.21 17:34, Laszlo Ersek wrote:
On 06/18/21 11:37, Marvin Häuser wrote:
On 16.06.21 22:58, Kun Qin wrote:
On 06/16/2021 00:02, Marvin Häuser wrote:
2) Is it feasible yet with the current set of supported
compilers to
support flexible arrays?
My impression is that flexible arrays are already supported (as seen
in UnitTestFrameworkPkg/PrivateInclude/UnitTestFrameworkTypes.h).
Please correct me if I am wrong.

Would you mind letting me know why this is applicable here? We are
trying to seek ideas on how to catch developer mistakes caused by
this
change. So any input is appreciated.
Huh, interesting. Last time I tried I was told about
incompatibilities
with MSVC, but I know some have been dropped since then (2005 and
2008
if I recall correctly?), so that'd be great to allow globally.
I too am surprised to see
"UnitTestFrameworkPkg/PrivateInclude/UnitTestFrameworkTypes.h". The
flexible array member is a C99 feature, and I didn't even know that we
disallowed it for the sake of particular VS toolchains -- I thought we
had a more general reason than just "not supported by VS versions X
and Y".

The behavior of OFFSET_OF() would be interesting -- the OFFSET_OF()
macro definition for non-gcc / non-clang:

#define OFFSET_OF(TYPE, Field) ((UINTN) &(((TYPE *)0)->Field))

borders on undefined behavior as far as I can tell, so its behavior is
totally up to the compiler. It works thus far okay on Visual
Studio, but
I couldn't say if it extended correctly to flexible array members.
Yes, it's UB by the standard, but this is actually how MS implements
them (or used to anyway?). I don't see why it'd cause issues with
flexible arrays, as only the start of the array is relevant (which is
constant for all instances of the structure no matter the amount of
elements actually stored). Any specific concern? If so, they could be
addressed by appropriate STATIC_ASSERTs.
No specific concern; my point was that two aspects of the same "class"
of undefined behavior didn't need to be consistent with each other.

Thanks
Laszlo


[PATCH v2] NetworkPkg:UEFIPXEBC

INDIA\sivaramann <emergingsiva@...>
 

There is an issue in the PxeBcDhcp4CallBack() function of the
UefiPxeBcDxe driver. If any disallowed event is received as input,
the function exits at the very beginning. As a result, the switch
cases handling the default and Dhcp4SendRequest events are unreachable.

Signed-off-by: Sivaraman <sivaramann@ami.com>
---
NetworkPkg/UefiPxeBcDxe/PxeBcDhcp4.c | 12 ++----------
1 file changed, 2 insertions(+), 10 deletions(-)

diff --git a/NetworkPkg/UefiPxeBcDxe/PxeBcDhcp4.c b/NetworkPkg/UefiPxeBcDxe/PxeBcDhcp4.c
index fb63cf61a9..e85176f9bb 100644
--- a/NetworkPkg/UefiPxeBcDxe/PxeBcDhcp4.c
+++ b/NetworkPkg/UefiPxeBcDxe/PxeBcDhcp4.c
@@ -1256,19 +1256,10 @@ PxeBcDhcp4CallBack (

//
// Cache the DHCPv4 discover packet to mode data directly.
- // It need to check SendGuid as well as Dhcp4SendRequest.
+ // It need to check SendGuid.
//
CopyMem (&Mode->DhcpDiscover.Dhcpv4, &Packet->Dhcp4, Packet->Length);

- case Dhcp4SendRequest:
- if (Packet->Length > PXEBC_DHCP4_PACKET_MAX_SIZE) {
- //
- // If the to be sent packet exceeds the maximum length, abort the DHCP process.
- //
- Status = EFI_ABORTED;
- break;
- }
-
if (Mode->SendGUID) {
//
// Send the system Guid instead of the MAC address as the hardware address if required.
@@ -1332,6 +1323,7 @@ PxeBcDhcp4CallBack (
break;

default:
+ ASSERT (FALSE);
break;
}

--
2.28.0.windows.1


Re: [PATCH v2 1/3] Revert "OvmfPkg/QemuKernelLoaderFsDxe: don't expose kernel command line"

Laszlo Ersek
 

On 06/17/21 14:16, Dov Murik wrote:
This reverts commit efc52d67e1573ce174d301b52fa1577d552c8441.

Manually fixed conflicts in:
OvmfPkg/QemuKernelLoaderFsDxe/QemuKernelLoaderFsDxe.c

Note that besides re-exposing the kernel command line as a file in the
synthetic filesystem, we also revert back to AllocatePool instead of
AllocatePages.

Cc: Laszlo Ersek <lersek@redhat.com>
Cc: Ard Biesheuvel <ardb+tianocore@kernel.org>
Cc: Jordan Justen <jordan.l.justen@intel.com>
Cc: James Bottomley <jejb@linux.ibm.com>
Cc: Tobin Feldman-Fitzthum <tobin@linux.ibm.com>
Signed-off-by: Dov Murik <dovmurik@linux.ibm.com>
---
OvmfPkg/QemuKernelLoaderFsDxe/QemuKernelLoaderFsDxe.c | 11 ++++++++---
1 file changed, 8 insertions(+), 3 deletions(-)
(1) The bugzilla ticket should be referenced in the commit message,
above your signoff:

Ref: https://bugzilla.tianocore.org/show_bug.cgi?id=3457

With that update:

Reviewed-by: Laszlo Ersek <lersek@redhat.com>

Thanks,
Laszlo


diff --git a/OvmfPkg/QemuKernelLoaderFsDxe/QemuKernelLoaderFsDxe.c b/OvmfPkg/QemuKernelLoaderFsDxe/QemuKernelLoaderFsDxe.c
index b09ff6a3590d..c7ddd86f5c75 100644
--- a/OvmfPkg/QemuKernelLoaderFsDxe/QemuKernelLoaderFsDxe.c
+++ b/OvmfPkg/QemuKernelLoaderFsDxe/QemuKernelLoaderFsDxe.c
@@ -33,6 +33,7 @@
typedef enum {
KernelBlobTypeKernel,
KernelBlobTypeInitrd,
+ KernelBlobTypeCommandLine,
KernelBlobTypeMax
} KERNEL_BLOB_TYPE;

@@ -59,6 +60,11 @@ STATIC KERNEL_BLOB mKernelBlob[KernelBlobTypeMax] = {
{
{ QemuFwCfgItemInitrdSize, QemuFwCfgItemInitrdData, },
}
+ }, {
+ L"cmdline",
+ {
+ { QemuFwCfgItemCommandLineSize, QemuFwCfgItemCommandLineData, },
+ }
}
};

@@ -948,7 +954,7 @@ FetchBlob (
//
// Read blob.
//
- Blob->Data = AllocatePages (EFI_SIZE_TO_PAGES ((UINTN)Blob->Size));
+ Blob->Data = AllocatePool (Blob->Size);
if (Blob->Data == NULL) {
DEBUG ((DEBUG_ERROR, "%a: failed to allocate %Ld bytes for \"%s\"\n",
__FUNCTION__, (INT64)Blob->Size, Blob->Name));
@@ -1083,8 +1089,7 @@ FreeBlobs:
while (BlobType > 0) {
CurrentBlob = &mKernelBlob[--BlobType];
if (CurrentBlob->Data != NULL) {
- FreePages (CurrentBlob->Data,
- EFI_SIZE_TO_PAGES ((UINTN)CurrentBlob->Size));
+ FreePool (CurrentBlob->Data);
CurrentBlob->Size = 0;
CurrentBlob->Data = NULL;
}


[PATCH v4] BaseTools GenFw: Add support for RISCV GOT/PLT relocations

Sunil V L
 

Ref: https://bugzilla.tianocore.org/show_bug.cgi?id=3096

This patch adds support for R_RISCV_CALL_PLT and R_RISCV_GOT_HI20
relocations generated by PIE enabled compiler. This also needed
changes to R_RISCV_32 and R_RISCV_64 relocations as explained in
https://github.com/riscv/riscv-gnu-toolchain/issues/905#issuecomment-846682710

Changes in v4:
- Fixed the typecast issue found by VS2019.

Changes in v3:
- Added the comments to address Liming's feedback.

Changes in v2:
- Addressed Daniel's comment on formatting

Testing:
1) Debian GCC 8.3.0 and booted sifive_u and QEMU virt models.
2) Debian 10.2.0 and booted QEMU virt model.
3) riscv-gnu-toolchain 9.2 and booted QEMU virt model.

Signed-off-by: Sunil V L <sunilvl@ventanamicro.com>

Acked-by: Abner Chang <abner.chang@hpe.com>
Reviewed-by: Daniel Schaefer <daniel.schaefer@hpe.com>
Tested-by: Daniel Schaefer <daniel.schaefer@hpe.com>

Cc: Bob Feng <bob.c.feng@intel.com>
Cc: Liming Gao <gaoliming@byosoft.com.cn>
Cc: Yuwei Chen <yuwei.chen@intel.com>
Cc: Heinrich Schuchardt <xypron.glpk@gmx.de>
---
BaseTools/Source/C/GenFw/Elf64Convert.c | 59 ++++++++++++++++++++++---
1 file changed, 53 insertions(+), 6 deletions(-)

diff --git a/BaseTools/Source/C/GenFw/Elf64Convert.c b/BaseTools/Source/C/GenFw/Elf64Convert.c
index d097db8632..f86be95fbb 100644
--- a/BaseTools/Source/C/GenFw/Elf64Convert.c
+++ b/BaseTools/Source/C/GenFw/Elf64Convert.c
@@ -129,6 +129,8 @@ STATIC UINT32 mDebugOffset;
STATIC UINT8 *mRiscVPass1Targ = NULL;
STATIC Elf_Shdr *mRiscVPass1Sym = NULL;
STATIC Elf64_Half mRiscVPass1SymSecIndex = 0;
+STATIC INT32 mRiscVPass1Offset;
+STATIC INT32 mRiscVPass1GotFixup;

//
// Initialization Function
@@ -473,17 +475,18 @@ WriteSectionRiscV64 (
{
UINT32 Value;
UINT32 Value2;
+ Elf64_Addr GOTEntryRva;

switch (ELF_R_TYPE(Rel->r_info)) {
case R_RISCV_NONE:
break;

case R_RISCV_32:
- *(UINT32 *)Targ = (UINT32)((UINT64)(*(UINT32 *)Targ) - SymShdr->sh_addr + mCoffSectionsOffset[Sym->st_shndx]);
+ *(UINT64 *)Targ = Sym->st_value + Rel->r_addend;
break;

case R_RISCV_64:
- *(UINT64 *)Targ = *(UINT64 *)Targ - SymShdr->sh_addr + mCoffSectionsOffset[Sym->st_shndx];
+ *(UINT64 *)Targ = Sym->st_value + Rel->r_addend;
break;

case R_RISCV_HI20:
@@ -533,6 +536,18 @@ WriteSectionRiscV64 (
mRiscVPass1SymSecIndex = 0;
break;

+ case R_RISCV_GOT_HI20:
+ GOTEntryRva = (Sym->st_value - Rel->r_offset);
+ mRiscVPass1Offset = RV_X(GOTEntryRva, 0, 12);
+ Value = (UINT32)RV_X(GOTEntryRva, 12, 20);
+ *(UINT32 *)Targ = (Value << 12) | (RV_X(*(UINT32*)Targ, 0, 12));
+
+ mRiscVPass1Targ = Targ;
+ mRiscVPass1Sym = SymShdr;
+ mRiscVPass1SymSecIndex = Sym->st_shndx;
+ mRiscVPass1GotFixup = 1;
+ break;
+
case R_RISCV_PCREL_HI20:
mRiscVPass1Targ = Targ;
mRiscVPass1Sym = SymShdr;
@@ -545,11 +560,17 @@ WriteSectionRiscV64 (
if (mRiscVPass1Targ != NULL && mRiscVPass1Sym != NULL && mRiscVPass1SymSecIndex != 0) {
int i;
Value2 = (UINT32)(RV_X(*(UINT32 *)mRiscVPass1Targ, 12, 20));
- Value = (UINT32)(RV_X(*(UINT32 *)Targ, 20, 12));
- if(Value & (RISCV_IMM_REACH/2)) {
- Value |= ~(RISCV_IMM_REACH-1);
+
+ if(mRiscVPass1GotFixup) {
+ Value = (UINT32)(mRiscVPass1Offset);
+ } else {
+ Value = (UINT32)(RV_X(*(UINT32 *)Targ, 20, 12));
+ if(Value & (RISCV_IMM_REACH/2)) {
+ Value |= ~(RISCV_IMM_REACH-1);
+ }
}
Value = Value - (UINT32)mRiscVPass1Sym->sh_addr + mCoffSectionsOffset[mRiscVPass1SymSecIndex];
+
if(-2048 > (INT32)Value) {
i = (((INT32)Value * -1) / 4096);
Value2 -= i;
@@ -569,12 +590,35 @@ WriteSectionRiscV64 (
}
}

- *(UINT32 *)Targ = (RV_X(Value, 0, 12) << 20) | (RV_X(*(UINT32*)Targ, 0, 20));
+ if(mRiscVPass1GotFixup) {
+ *(UINT32 *)Targ = (RV_X((UINT32)Value, 0, 12) << 20)
+ | (RV_X(*(UINT32*)Targ, 0, 20));
+ // Convert LD instruction to ADDI
+ //
+ // |31 20|19 15|14 12|11 7|6 0|
+ // |-----------------------------------------|
+ // |imm[11:0] | rs1 | 011 | rd | 0000011 | LD
+ // -----------------------------------------

+ // |-----------------------------------------|
+ // |imm[11:0] | rs1 | 000 | rd | 0010011 | ADDI
+ // -----------------------------------------

+ // To convert, let's first reset bits 12-14 and 0-6 using ~0x707f
+ // Then modify the opcode to ADDI (0010011)
+ // All other fields will remain same.

+ *(UINT32 *)Targ = ((*(UINT32 *)Targ & ~0x707f) | 0x13);
+ } else {
+ *(UINT32 *)Targ = (RV_X(Value, 0, 12) << 20) | (RV_X(*(UINT32*)Targ, 0, 20));
+ }
*(UINT32 *)mRiscVPass1Targ = (RV_X(Value2, 0, 20)<<12) | (RV_X(*(UINT32 *)mRiscVPass1Targ, 0, 12));
}
mRiscVPass1Sym = NULL;
mRiscVPass1Targ = NULL;
mRiscVPass1SymSecIndex = 0;
+ mRiscVPass1Offset = 0;
+ mRiscVPass1GotFixup = 0;
break;

case R_RISCV_ADD64:
@@ -586,6 +630,7 @@ WriteSectionRiscV64 (
case R_RISCV_GPREL_I:
case R_RISCV_GPREL_S:
case R_RISCV_CALL:
+ case R_RISCV_CALL_PLT:
case R_RISCV_RVC_BRANCH:
case R_RISCV_RVC_JUMP:
case R_RISCV_RELAX:
@@ -1528,6 +1573,7 @@ WriteRelocations64 (
case R_RISCV_GPREL_I:
case R_RISCV_GPREL_S:
case R_RISCV_CALL:
+ case R_RISCV_CALL_PLT:
case R_RISCV_RVC_BRANCH:
case R_RISCV_RVC_JUMP:
case R_RISCV_RELAX:
@@ -1537,6 +1583,7 @@ WriteRelocations64 (
case R_RISCV_SET16:
case R_RISCV_SET32:
case R_RISCV_PCREL_HI20:
+ case R_RISCV_GOT_HI20:
case R_RISCV_PCREL_LO12_I:
break;

--
2.25.1


Re: [PATCH v1 5/5] ArmVirtPkg: Enable Acpiview for ArmVirtPkg

Laszlo Ersek
 

On 06/24/21 14:59, Laszlo Ersek wrote:
On 06/23/21 16:06, PierreGondois wrote:
From: Sami Mujawar <sami.mujawar@arm.com>

Acpiview is a command line tool for displaying, dumping, and checking
installed ACPI tables. Add the tool to ArmVirt platforms.

Signed-off-by: Sami Mujawar <sami.mujawar@arm.com>
Signed-off-by: Pierre Gondois <Pierre.Gondois@arm.com>
---
ArmVirtPkg/ArmVirt.dsc.inc | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/ArmVirtPkg/ArmVirt.dsc.inc b/ArmVirtPkg/ArmVirt.dsc.inc
index d9abadbe708c..269ac4990a6c 100644
--- a/ArmVirtPkg/ArmVirt.dsc.inc
+++ b/ArmVirtPkg/ArmVirt.dsc.inc
@@ -1,5 +1,5 @@
#
-# Copyright (c) 2011-2015, ARM Limited. All rights reserved.
+# Copyright (c) 2011-2021, Arm Limited. All rights reserved.
# Copyright (c) 2014, Linaro Limited. All rights reserved.
# Copyright (c) 2015 - 2018, Intel Corporation. All rights reserved.
# Copyright (c) Microsoft Corporation.
@@ -398,6 +398,7 @@ [Components.common]
NULL|ShellPkg/Library/UefiShellLevel3CommandsLib/UefiShellLevel3CommandsLib.inf
NULL|ShellPkg/Library/UefiShellDriver1CommandsLib/UefiShellDriver1CommandsLib.inf
NULL|ShellPkg/Library/UefiShellDebug1CommandsLib/UefiShellDebug1CommandsLib.inf
+ NULL|ShellPkg/Library/UefiShellAcpiViewCommandLib/UefiShellAcpiViewCommandLib.inf
NULL|ShellPkg/Library/UefiShellInstall1CommandsLib/UefiShellInstall1CommandsLib.inf
NULL|ShellPkg/Library/UefiShellNetwork1CommandsLib/UefiShellNetwork1CommandsLib.inf
!if $(NETWORK_IP6_ENABLE) == TRUE
I disagree with this patch, as it will cause the Shell binary in all
ArmVirtPkg platforms to include the (rather large) ACPIVIEW command.

ACPIVIEW is super useful when the tables are (dynamically) generated
by the firmware itself, but that does not apply to the QEMU and Xen
platforms.

Note NETWORK_IP6_ENABLE: UefiShellNetwork2CommandsLib is only hooked
into the shell application if NETWORK_IP6_ENABLE is TRUE.

Please add

DEFINE ACPIVIEW_ENABLE = TRUE

to "ArmVirtPkg/ArmVirtKvmTool.dsc",
To clarify: please place

DEFINE ACPIVIEW_ENABLE = TRUE

in a new [Defines.AARCH64] section in "ArmVirtPkg/ArmVirtKvmTool.dsc",
not in the existing [Defines] section.

This should happen just before !including "ArmVirtPkg/ArmVirt.dsc.inc".
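Concretely, the arrangement being requested might look like this (a sketch only; platforms that do not opt in may also need a "DEFINE ACPIVIEW_ENABLE = FALSE" default so the !if can evaluate):

```
# ArmVirtPkg/ArmVirtKvmTool.dsc -- new section, just before the !include:
[Defines.AARCH64]
  DEFINE ACPIVIEW_ENABLE = TRUE

!include ArmVirtPkg/ArmVirt.dsc.inc

# ArmVirtPkg/ArmVirt.dsc.inc -- hook in the command lib conditionally:
!if $(ACPIVIEW_ENABLE) == TRUE
  NULL|ShellPkg/Library/UefiShellAcpiViewCommandLib/UefiShellAcpiViewCommandLib.inf
!endif
```

This mirrors how NETWORK_IP6_ENABLE already gates UefiShellNetwork2CommandsLib in the same include file.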

Thanks
Laszlo

and in "ArmVirtPkg/ArmVirt.dsc.inc",
include the new command lib conditionally on ACPIVIEW_ENABLE being TRUE.
(Can be in the same patch.)

Acked-by: Laszlo Ersek <lersek@redhat.com>


Thanks
Laszlo


Re: [PATCH v1 0/5] Add ACPI support for Kvmtool

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:

Pierre Gondois (1):
ArmVirtPkg: Add cspell exceptions

Sami Mujawar (4):
ArmVirtPkg: Add DSDT ACPI table for Kvmtool firmware
ArmVirtPkg: Add Configuration Manager for Kvmtool firmware
ArmVirtPkg: Enable ACPI support for Kvmtool
ArmVirtPkg: Enable Acpiview for ArmVirtPkg
The subject lines of Sami's 4 patches should be updated as follows:

ArmVirtPkg/Kvmtool: Add DSDT ACPI table
ArmVirtPkg/Kvmtool: Add Configuration Manager
ArmVirtPkg/Kvmtool: Enable ACPI support
ArmVirtPkg/Kvmtool: Enable Acpiview

(regarding the last patch, I requested in its subthread that ACPIVIEW be
restricted to kvmtool please.)

Thanks
Laszlo


Re: [PATCH v1 4/5] ArmVirtPkg: Enable ACPI support for Kvmtool

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:
From: Sami Mujawar <sami.mujawar@arm.com>

A Configuration Manager that uses the Dynamic Tables framework
to generate ACPI tables for Kvmtool Guests has been provided.
This Configuration Manager uses the FdtHwInfoParser module to
parse the Kvmtool Device Tree and generate the required
Configuration Manager objects for generating the ACPI tables.

Therefore, enable ACPI table generation for Kvmtool.

Signed-off-by: Sami Mujawar <sami.mujawar@arm.com>
Signed-off-by: Pierre Gondois <Pierre.Gondois@arm.com>
---
ArmVirtPkg/ArmVirtKvmTool.dsc | 15 +++++++++++++--
ArmVirtPkg/ArmVirtKvmTool.fdf | 11 +++++++++++
2 files changed, 24 insertions(+), 2 deletions(-)

diff --git a/ArmVirtPkg/ArmVirtKvmTool.dsc b/ArmVirtPkg/ArmVirtKvmTool.dsc
index 920880796ac2..b02324312f18 100644
--- a/ArmVirtPkg/ArmVirtKvmTool.dsc
+++ b/ArmVirtPkg/ArmVirtKvmTool.dsc
@@ -28,6 +28,7 @@ [Defines]
FLASH_DEFINITION = ArmVirtPkg/ArmVirtKvmTool.fdf

!include ArmVirtPkg/ArmVirt.dsc.inc
+!include DynamicTablesPkg/DynamicTables.dsc.inc
(1) This doesn't seem right. In fact, ARM (not AARCH64) support claimed in "DynamicTablesPkg/DynamicTablesPkg.dsc" seems bogus in the first place; to my understanding, ACPI is not defined for 32-bit ARM.

More precisely, this !include directive is OK, but "DynamicTablesPkg/DynamicTables.dsc.inc" file should not provide a

[Components.common]

section, but a

[Components.AARCH64]

section. Refer to "ArmVirtPkg/ArmVirt.dsc.inc" please:

[Components.AARCH64]
#
# ACPI Support
#
MdeModulePkg/Universal/Acpi/AcpiTableDxe/AcpiTableDxe.inf {
<LibraryClasses>
NULL|EmbeddedPkg/Library/PlatformHasAcpiLib/PlatformHasAcpiLib.inf
}



!include MdePkg/MdeLibs.dsc.inc

@@ -144,6 +145,11 @@ [PcdsFixedAtBuild.common]
#
gEmbeddedTokenSpaceGuid.PcdPrePiCpuIoSize|16

+ #
+ # ACPI Table Version
+ #
+ gEfiMdeModulePkgTokenSpaceGuid.PcdAcpiExposedTableVersions|0x20
+
[PcdsPatchableInModule.common]
#
# This will be overridden in the code
(2) This hunk is superfluous. Please refer to "MdeModulePkg/MdeModulePkg.dec":

[PcdsFixedAtBuild.AARCH64, PcdsPatchableInModule.AARCH64]
gEfiMdeModulePkgTokenSpaceGuid.PcdAcpiExposedTableVersions|0x20|UINT32|0x0001004c

If you simply don't list the PCD in your DSC file, you'll get the default value, and the most restrictive access method declared for the PCD in the DEC file (here: fixed-at-build).


@@ -198,8 +204,8 @@ [PcdsDynamicDefault.common]
gEfiMdeModulePkgTokenSpaceGuid.PcdSetupVideoHorizontalResolution|640
gEfiMdeModulePkgTokenSpaceGuid.PcdSetupVideoVerticalResolution|480

- ## Force DTB
- gArmVirtTokenSpaceGuid.PcdForceNoAcpi|TRUE
+ ## Set default option to ACPI
+ gArmVirtTokenSpaceGuid.PcdForceNoAcpi|FALSE

# Setup Flash storage variables
gEfiMdeModulePkgTokenSpaceGuid.PcdFlashNvStorageVariableBase|0
(3) Same story as (2), please refer to "ArmVirtPkg/ArmVirtPkg.dec":

[PcdsDynamic]
#
# Whether to force disable ACPI, regardless of the fw_cfg settings
# exposed by QEMU
#
gArmVirtTokenSpaceGuid.PcdForceNoAcpi|0x0|BOOLEAN|0x00000003


@@ -356,3 +362,8 @@ [Components.common]
}
OvmfPkg/VirtioPciDeviceDxe/VirtioPciDeviceDxe.inf
OvmfPkg/Virtio10Dxe/Virtio10.inf
+ #
+ # ACPI Support
+ #
+ MdeModulePkg/Universal/Acpi/AcpiTableDxe/AcpiTableDxe.inf
(4) Superfluous by virtue of including "ArmVirtPkg/ArmVirt.dsc.inc" already.


+ ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf
(5) This should be in a [Components.AARCH64] section, not in [Components.common].


diff --git a/ArmVirtPkg/ArmVirtKvmTool.fdf b/ArmVirtPkg/ArmVirtKvmTool.fdf
index 076155199905..5ba4c579f050 100644
--- a/ArmVirtPkg/ArmVirtKvmTool.fdf
+++ b/ArmVirtPkg/ArmVirtKvmTool.fdf
@@ -204,6 +204,17 @@ [FV.FvMain]
INF OvmfPkg/VirtioPciDeviceDxe/VirtioPciDeviceDxe.inf
INF OvmfPkg/Virtio10Dxe/Virtio10.inf

+ #
+ # ACPI Support
+ #
+ INF MdeModulePkg/Universal/Acpi/AcpiTableDxe/AcpiTableDxe.inf
+ #
+ # Dynamic Table fdf
+ #
+ !include DynamicTablesPkg/DynamicTables.fdf.inc
+
+ INF ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf
+
#
# TianoCore logo (splash screen)
#
(6) Please do what "ArmVirtPkg/ArmVirtQemuFvMain.fdf.inc" does:

!if $(ARCH) == AARCH64
INF MdeModulePkg/Universal/Acpi/AcpiTableDxe/AcpiTableDxe.inf
...
!endif


Acked-by: Laszlo Ersek <lersek@redhat.com>

Thanks
Laszlo


Re: [PATCH v1 3/5] ArmVirtPkg: Add Configuration Manager for Kvmtool firmware

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:

diff --git a/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf
new file mode 100644
index 000000000000..9f0bf72fce2d
--- /dev/null
+++ b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf
+#
+# The following information is for reference only and not required by the build tools.
+#
+# VALID_ARCHITECTURES = ARM AARCH64
+#
(20) ACPI is undefined for ARM, therefore VALID_ARCHITECTURES should
only list AARCH64.

Thanks
Laszlo


Re: [PATCH v1 3/5] ArmVirtPkg: Add Configuration Manager for Kvmtool firmware

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:
From: Sami Mujawar <sami.mujawar@arm.com>

Add Configuration Manager to enable ACPI tables for Kvmtool
firmware. The Configuration Manager for Kvmtool uses the DT
Hardware Information Parser module (FdtHwInfoParser) to parse
the DT provided by Kvmtool. The FdtHwInfoParser parses the DT
and invokes the callback function HW_INFO_ADD_OBJECT to add
the Configuration Manager objects to the Platform Information
repository.

The information for some Configuration Manager objects may not
be available in the DT. Such objects are initialised locally
by the Configuration Manager.

Support for the following ACPI tables is provided:
- DBG2
- DSDT (Empty stub)
- FADT
- GTDT
- MADT
- SPCR
- SSDT (Cpu Hierarchy)
- SSDT (Pcie bus)

Signed-off-by: Sami Mujawar <sami.mujawar@arm.com>
Signed-off-by: Pierre Gondois <Pierre.Gondois@arm.com>
---
ArmVirtPkg/ArmVirtKvmTool.dsc | 3 +
.../KvmtoolCfgMgrDxe/ConfigurationManager.c | 948 ++++++++++++++++++
.../KvmtoolCfgMgrDxe/ConfigurationManager.h | 94 ++
.../ConfigurationManagerDxe.inf | 58 ++
4 files changed, 1103 insertions(+)
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.c
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.h
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf

diff --git a/ArmVirtPkg/ArmVirtKvmTool.dsc b/ArmVirtPkg/ArmVirtKvmTool.dsc
index 3bd1cc72a1eb..920880796ac2 100644
--- a/ArmVirtPkg/ArmVirtKvmTool.dsc
+++ b/ArmVirtPkg/ArmVirtKvmTool.dsc
@@ -71,6 +71,9 @@ [LibraryClasses.common]
PlatformHookLib|ArmVirtPkg/Library/Fdt16550SerialPortHookLib/Fdt16550SerialPortHookLib.inf
SerialPortLib|MdeModulePkg/Library/BaseSerialPortLib16550/BaseSerialPortLib16550.inf

+ HwInfoParserLib|DynamicTablesPkg/Library/FdtHwInfoParserLib/FdtHwInfoParserLib.inf
+ DynamicPlatRepoLib|DynamicTablesPkg/Library/Common/DynamicPlatRepoLib/DynamicPlatRepoLib.inf
+
[LibraryClasses.common.SEC, LibraryClasses.common.PEI_CORE, LibraryClasses.common.PEIM]
PciExpressLib|MdePkg/Library/BasePciExpressLib/BasePciExpressLib.inf
PlatformHookLib|ArmVirtPkg/Library/Fdt16550SerialPortHookLib/EarlyFdt16550SerialPortHookLib.inf
(1) Please move at least the line

ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf

from the next patch into this patch.

The reason is that without that line in the DSC file, you can't build
the new driver, right after this patch is applied, even with the "-m"
option of the "build" utility. And if you cannot build the driver, then
adding the library class resolutions *here* is useless.

... Alternatively, please move the lib class resolutions to the next
patch (and don't touch the DSC file in this patch).


diff --git a/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.c b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.c
new file mode 100644
index 000000000000..07b8b403dd4a
--- /dev/null
+++ b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.c
@@ -0,0 +1,948 @@
+/** @file
+ Configuration Manager Dxe
+
+ Copyright (c) 2021, Arm Limited. All rights reserved.<BR>
+
+ SPDX-License-Identifier: BSD-2-Clause-Patent
+
+ @par Glossary:
+ - Cm or CM - Configuration Manager
+ - Obj or OBJ - Object
+**/
+
+#include <IndustryStandard/DebugPort2Table.h>
+#include <IndustryStandard/IoRemappingTable.h>
+#include <IndustryStandard/MemoryMappedConfigurationSpaceAccessTable.h>
+#include <IndustryStandard/SerialPortConsoleRedirectionTable.h>
+#include <Library/BaseMemoryLib.h>
+#include <Library/DebugLib.h>
+#include <Library/DynamicPlatRepoLib.h>
+#include <Library/HobLib.h>
+#include <Library/HwInfoParserLib.h>
+#include <Library/IoLib.h>
+#include <Library/PcdLib.h>
+#include <Library/TableHelperLib.h>
+#include <Library/UefiBootServicesTableLib.h>
+#include <Protocol/AcpiTable.h>
+#include <Protocol/ConfigurationManagerProtocol.h>
+
+#include "ConfigurationManager.h"
+
+/** The platform configuration repository information.
+*/
(2) This comment style is only used on functions. For other declarations
/ definitions, please use

//
// blah
//

+STATIC
+EDKII_PLATFORM_REPOSITORY_INFO KvmtoolPlatRepositoryInfo = {
+ /// Configuration Manager information
(3) "///" is indeed sometimes used in edk2, but here it is inconsistent
with the other "//" usages.

(4) please add leading and trailing empty comment lines (elsewhere too)

(5) all module global variables must start with a lower-case "m". Such
as "mKvmtoolPlatRepositoryInfo".

+ { CONFIGURATION_MANAGER_REVISION, CFG_MGR_OEM_ID },
+
+ // ACPI Table List
+ {
+ // FADT Table
+ {
+ EFI_ACPI_6_3_FIXED_ACPI_DESCRIPTION_TABLE_SIGNATURE,
+ EFI_ACPI_6_3_FIXED_ACPI_DESCRIPTION_TABLE_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdFadt),
+ NULL
+ },
+ // GTDT Table
+ {
+ EFI_ACPI_6_3_GENERIC_TIMER_DESCRIPTION_TABLE_SIGNATURE,
+ EFI_ACPI_6_3_GENERIC_TIMER_DESCRIPTION_TABLE_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdGtdt),
+ NULL
+ },
+ // MADT Table
+ {
+ EFI_ACPI_6_3_MULTIPLE_APIC_DESCRIPTION_TABLE_SIGNATURE,
+ EFI_ACPI_6_3_MULTIPLE_APIC_DESCRIPTION_TABLE_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdMadt),
+ NULL
+ },
+ // SPCR Table
+ {
+ EFI_ACPI_6_3_SERIAL_PORT_CONSOLE_REDIRECTION_TABLE_SIGNATURE,
+ EFI_ACPI_SERIAL_PORT_CONSOLE_REDIRECTION_TABLE_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdSpcr),
+ NULL
+ },
+ // DSDT Table
+ {
+ EFI_ACPI_6_3_DIFFERENTIATED_SYSTEM_DESCRIPTION_TABLE_SIGNATURE,
+ 0, // Unused
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdDsdt),
+ (EFI_ACPI_DESCRIPTION_HEADER*)dsdt_aml_code
+ },
+ // SSDT Cpu Hierarchy Table
+ {
+ EFI_ACPI_6_3_SECONDARY_SYSTEM_DESCRIPTION_TABLE_SIGNATURE,
+ 0, // Unused
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdSsdtCpuTopology),
+ NULL
+ },
+ // DBG2 Table
+ {
+ EFI_ACPI_6_3_DEBUG_PORT_2_TABLE_SIGNATURE,
+ EFI_ACPI_DBG2_DEBUG_DEVICE_INFORMATION_STRUCT_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdDbg2),
+ NULL
+ },
+ // PCI MCFG Table
+ {
+ EFI_ACPI_6_3_PCI_EXPRESS_MEMORY_MAPPED_CONFIGURATION_SPACE_BASE_ADDRESS_DESCRIPTION_TABLE_SIGNATURE,
+ EFI_ACPI_MEMORY_MAPPED_CONFIGURATION_SPACE_ACCESS_TABLE_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdMcfg),
+ NULL
+ },
+ // SSDT table describing the PCI root complex
+ {
+ EFI_ACPI_6_3_SECONDARY_SYSTEM_DESCRIPTION_TABLE_SIGNATURE,
+ 0, // Unused
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdSsdtPciExpress),
+ NULL
+ },
+ // IORT Table
+ {
+ EFI_ACPI_6_3_IO_REMAPPING_TABLE_SIGNATURE,
+ EFI_ACPI_IO_REMAPPING_TABLE_REVISION,
+ CREATE_STD_ACPI_TABLE_GEN_ID (EStdAcpiTableIdIort),
+ NULL
+ },
+ },
+
+ // Power management profile information
+ { EFI_ACPI_6_3_PM_PROFILE_ENTERPRISE_SERVER }, // PowerManagement Profile
+
+ // ITS group node
+ {
+ // Reference token for this Iort node
+ REFERENCE_TOKEN (ItsGroupInfo),
+ // The number of ITS identifiers in the ITS node.
+ 1,
+ // Reference token for the ITS identifier array
+ REFERENCE_TOKEN (ItsIdentifierArray)
+ },
+ // ITS identifier array
+ {
+ {
+ // The ITS Identifier
+ 0
+ }
+ },
+
+ // Root Complex node info
+ {
+ // Reference token for this Iort node
+ REFERENCE_TOKEN (RootComplexInfo),
+ // Number of ID mappings
+ 1,
+ // Reference token for the ID mapping array
+ REFERENCE_TOKEN (DeviceIdMapping[0]),
+
+ // Memory access properties : Cache coherent attributes
+ EFI_ACPI_IORT_MEM_ACCESS_PROP_CCA,
+ // Memory access properties : Allocation hints
+ 0,
+ // Memory access properties : Memory access flags
+ 0,
+ // ATS attributes
+ EFI_ACPI_IORT_ROOT_COMPLEX_ATS_UNSUPPORTED,
+ // PCI segment number
+ 0
+ },
+
+ // Array of Device ID mappings
+ {
+ /* RootComplex -> ITS Group
+ */
+ // Device ID mapping for Root complex node
+ {
+ // Input base
+ 0x0,
+ // Number of input IDs
+ 0x0000FFFF,
+ // Output Base
+ 0x0,
+ // Output reference
+ REFERENCE_TOKEN (ItsGroupInfo),
+ // Flags
+ 0
+ },
+ },
+};
+
+/** A helper function for returning the Configuration Manager Objects.
(6) I think we tend to keep the "/**" comment starter alone on the line,
with the actual function description starting on the next line.

I suggest updating all the other comment blocks, too.

+
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Object Pointer to the Object(s).
+ @param [in] ObjectSize Total size of the Object(s).
+ @param [in] ObjectCount Number of Objects.
+ @param [in, out] CmObjectDesc Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+**/
+STATIC
+EFI_STATUS
+EFIAPI
+HandleCmObject (
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN VOID * Object,
+ IN CONST UINTN ObjectSize,
+ IN CONST UINTN ObjectCount,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObjectDesc
+ )
+{
+ CmObjectDesc->ObjectId = CmObjectId;
+ CmObjectDesc->Size = ObjectSize;
+ CmObjectDesc->Data = (VOID*)Object;
(7) This cast is useless. Please update the rest of the code too, if
necessary.

+ CmObjectDesc->Count = ObjectCount;
+ DEBUG ((
+ DEBUG_INFO,
+ "INFO: CmObjectId = %x, Ptr = 0x%p, Size = %d, Count = %d\n",
+ CmObjectId,
+ CmObjectDesc->Data,
+ CmObjectDesc->Size,
+ CmObjectDesc->Count
+ ));
(8) Please audit the format specifications in all of your DEBUGs.

First, wherever you introduce the CM_OBJECT_ID typedef, please also
#define a new format string macro for printing objects of type
CM_OBJECT_ID. Then, in DEBUG format strings, please use that typedef,
rather than an open-coded %x. "%x" may be proper (I can't tell), but it
should be hidden behind a macro.

#define FMT_CM_OBJECT_ID "%x"

DEBUG ((DEBUG_INFO, "blah " FMT_CM_OBJECT_ID " blah\n", CmObjectId));


Second, *assuming* "CmObjectDesc->Size" has type UINTN, the right way to
print it is not with %d, but (a) casting the value to UINT64, (b)
printing the converted value with %Lu. Same for "CmObjectDesc->Count".


+ return EFI_SUCCESS;
+}
+
+/** A helper function for returning the Configuration Manager Objects that
+ match the token.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Object Pointer to the Object(s).
+ @param [in] ObjectSize Total size of the Object(s).
+ @param [in] ObjectCount Number of Objects.
+ @param [in] Token A token identifying the object.
+ @param [in] HandlerProc A handler function to search the object
+ referenced by the token.
+ @param [in, out] CmObjectDesc Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+STATIC
+EFI_STATUS
+EFIAPI
+HandleCmObjectRefByToken (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN VOID * Object,
+ IN CONST UINTN ObjectSize,
+ IN CONST UINTN ObjectCount,
+ IN CONST CM_OBJECT_TOKEN Token,
+ IN CONST CM_OBJECT_HANDLER_PROC HandlerProc,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObjectDesc
+ )
+{
+ EFI_STATUS Status;
+ CmObjectDesc->ObjectId = CmObjectId;
+ if (Token == CM_NULL_TOKEN) {
+ CmObjectDesc->Size = ObjectSize;
+ CmObjectDesc->Data = (VOID*)Object;
+ CmObjectDesc->Count = ObjectCount;
+ Status = EFI_SUCCESS;
+ } else {
+ Status = HandlerProc (This, CmObjectId, Token, CmObjectDesc);
+ }
+
+ DEBUG ((
+ DEBUG_INFO,
+ "INFO: Token = 0x%p, CmObjectId = %x, Ptr = 0x%p, Size = %d, Count = %d\n",
+ (VOID*)Token,
+ CmObjectId,
+ CmObjectDesc->Data,
+ CmObjectDesc->Size,
+ CmObjectDesc->Count
+ ));
+ return Status;
+}
+
+/** Return an ITS identifier array.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token A token for identifying the object
+ @param [out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+EFI_STATUS
+EFIAPI
+GetItsIdentifierArray (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token,
+ OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+
+ if ((This == NULL) || (CmObject == NULL)) {
+ ASSERT (0);
(9) should be ASSERT (FALSE) -- please update all instances.

Better yet: in GetStandardNameSpaceObject(), you use ASSERT()
expressions that make a lot more sense, when logged. I suggest replacing
all the ASSERT (0) calls with that (more useful) pattern.

+ return EFI_INVALID_PARAMETER;
+ }
+
+ PlatformRepo = This->PlatRepoInfo;
+
+ if (Token != (CM_OBJECT_TOKEN)&PlatformRepo->ItsIdentifierArray) {
+ return EFI_NOT_FOUND;
+ }
+
+ CmObject->ObjectId = CmObjectId;
+ CmObject->Size = sizeof (PlatformRepo->ItsIdentifierArray);
+ CmObject->Data = (VOID*)&PlatformRepo->ItsIdentifierArray;
+ CmObject->Count = ARRAY_SIZE (PlatformRepo->ItsIdentifierArray);
+ return EFI_SUCCESS;
+}
+
+/** Return a device Id mapping array.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token A token for identifying the object
+ @param [out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+EFI_STATUS
+EFIAPI
+GetDeviceIdMappingArray (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token,
+ OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+
+ if ((This == NULL) || (CmObject == NULL)) {
+ ASSERT (0);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ PlatformRepo = This->PlatRepoInfo;
+
+ if (Token != (CM_OBJECT_TOKEN)&PlatformRepo->DeviceIdMapping[0]) {
+ return EFI_NOT_FOUND;
+ }
+
+ CmObject->ObjectId = CmObjectId;
+ CmObject->Size = sizeof (CM_ARM_ID_MAPPING);
+ CmObject->Data = (VOID*)Token;
+ CmObject->Count = 1;
+ return EFI_SUCCESS;
+}
+
+/** Function pointer called by the parser to add information.
+
+ Callback function that the parser can use to add new
+ CmObj. This function must copy the CmObj data and not rely on
+ the parser preserving the CmObj memory.
+ This function is responsible for the Token allocation.
+
+ @param [in] ParserHandle A handle to the parser instance.
+ @param [in] Context A pointer to the caller's context provided in
+ HwInfoParserInit ().
+ @param [in] CmObjDesc CM_OBJ_DESCRIPTOR containing the CmObj(s) to add.
+ @param [out] Token If provided and success, contain the token
+ generated for the CmObj.
+
+ @retval EFI_SUCCESS The function completed successfully.
+ @retval EFI_INVALID_PARAMETER Invalid parameter.
+**/
+STATIC
+EFI_STATUS
+EFIAPI
+HwInfoAdd (
+ IN HW_INFO_PARSER_HANDLE ParserHandle,
+ IN VOID * Context,
+ IN CONST CM_OBJ_DESCRIPTOR * CmObjDesc,
+ OUT CM_OBJECT_TOKEN * Token OPTIONAL
+ )
+{
+ EFI_STATUS Status;
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+
+ if ((ParserHandle == NULL) ||
+ (Context == NULL) ||
+ (CmObjDesc == NULL)) {
+ ASSERT (0);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ PlatformRepo = (EDKII_PLATFORM_REPOSITORY_INFO*)Context;
+
+#ifndef MDEPKG_NDEBUG
+ // Print the received objects.
+ ParseCmObjDesc (CmObjDesc);
+#endif
(10) Please use the DEBUG_CODE() or DEBUG_CODE_BEGIN() /
DEBUG_CODE_END() macros here.
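As a sketch, the hunk would then read (DEBUG_CODE_BEGIN / DEBUG_CODE_END are the DebugLib macros; besides compiling out with MDEPKG_NDEBUG, they also honor the debug-code bit of PcdDebugPropertyMask at runtime):

```
  DEBUG_CODE_BEGIN ();
  // Print the received objects.
  ParseCmObjDesc (CmObjDesc);
  DEBUG_CODE_END ();
```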

+
+ Status = DynPlatRepoAddObject (
+ PlatformRepo->DynamicPlatformRepo,
+ CmObjDesc,
+ Token
+ );
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ }
(11) Please use the dedicated macro ASSERT_EFI_ERROR().

(There are other instances of the same, please update those as well.)

+ return Status;
+}
+
+/** Cleanup the platform configuration repository.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+
+ @retval EFI_SUCCESS Success
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+**/
+STATIC
+EFI_STATUS
+EFIAPI
+CleanupPlatformRepository (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This
+ )
+{
+ EFI_STATUS Status;
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+
+ if (This == NULL) {
+ ASSERT (0);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ PlatformRepo = This->PlatRepoInfo;
+
+ // Shutdown the dynamic repo and free all objects.
+ Status = DynamicPlatRepoShutdown (PlatformRepo->DynamicPlatformRepo);
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ return Status;
+ }
+
+ // Shutdown parser.
+ Status = HwInfoParserShutdown (PlatformRepo->FdtParserHandle);
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ }
+ return Status;
+}
+
+/** Initialize the platform configuration repository.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+
+ @retval EFI_SUCCESS Success
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_OUT_OF_RESOURCES An allocation has failed.
+**/
+STATIC
+EFI_STATUS
+EFIAPI
+InitializePlatformRepository (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This
+ )
+{
+ EFI_STATUS Status;
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+ VOID * Hob;
+
+ if (This == NULL) {
+ ASSERT (0);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ Hob = GetFirstGuidHob (&gFdtHobGuid);
+ if (Hob == NULL || GET_GUID_HOB_DATA_SIZE (Hob) != sizeof (UINT64)) {
+ ASSERT (0);
+ return EFI_NOT_FOUND;
+ }
+
+ PlatformRepo = This->PlatRepoInfo;
+ PlatformRepo->FdtBase = (VOID *)*(UINTN*)GET_GUID_HOB_DATA (Hob);
+
+ // Initialise the dynamic platform repository.
+ Status = DynamicPlatRepoInit (&PlatformRepo->DynamicPlatformRepo);
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ return Status;
+ }
+
+ // Initialise the FDT parser
+ Status = HwInfoParserInit (
+ PlatformRepo->FdtBase,
+ PlatformRepo,
+ HwInfoAdd,
+ &PlatformRepo->FdtParserHandle
+ );
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ goto error_handler;
(12) This label should be named ErrorHandler or just Error.

(Applies to ConfigurationManagerDxeInitialize() as well.)

+ }
+
+ Status = HwInfoParse (PlatformRepo->FdtParserHandle);
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ goto error_handler;
+ }
+
+ Status = DynamicPlatRepoFinalise (PlatformRepo->DynamicPlatformRepo);
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ goto error_handler;
+ }
+
+ return EFI_SUCCESS;
+
+error_handler:
+ CleanupPlatformRepository (This);
+ return Status;
+}
+
+/** Return a standard namespace object.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token An optional token identifying the object. If
+ unused this must be CM_NULL_TOKEN.
+ @param [in, out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+EFI_STATUS
+EFIAPI
+GetStandardNameSpaceObject (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token OPTIONAL,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ EFI_STATUS Status;
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+ UINTN AcpiTableCount;
+ CM_OBJ_DESCRIPTOR CmObjDesc;
+
+ if ((This == NULL) || (CmObject == NULL)) {
+ ASSERT (This != NULL);
+ ASSERT (CmObject != NULL);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ Status = EFI_NOT_FOUND;
+ PlatformRepo = This->PlatRepoInfo;
+
+ switch (GET_CM_OBJECT_ID (CmObjectId)) {
+ case EStdObjCfgMgrInfo:
+ Status = HandleCmObject (
+ CmObjectId,
+ &PlatformRepo->CmInfo,
+ sizeof (PlatformRepo->CmInfo),
+ 1,
+ CmObject
+ );
+ break;
+
+ case EStdObjAcpiTableList:
+ AcpiTableCount = ARRAY_SIZE (PlatformRepo->CmAcpiTableList);
+
+ // Get Pci config space information.
+ Status = DynamicPlatRepoGetObject (
+ PlatformRepo->DynamicPlatformRepo,
+ CREATE_CM_ARM_OBJECT_ID (EArmObjPciConfigSpaceInfo),
+ CM_NULL_TOKEN,
+ &CmObjDesc
+ );
+ if (Status == EFI_NOT_FOUND) {
+ // The last 3 tables are for PCIe. If PCIe information is not
+ // present, Kvmtool was launched without the PCIe option.
+ // Therefore, reduce the table count by 3.
+ AcpiTableCount -= 3;
+ } else if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ return Status;
+ }
+
+ // Get the Gic version.
+ Status = DynamicPlatRepoGetObject (
+ PlatformRepo->DynamicPlatformRepo,
+ CREATE_CM_ARM_OBJECT_ID (EArmObjGicDInfo),
+ CM_NULL_TOKEN,
+ &CmObjDesc
+ );
+ if (EFI_ERROR (Status)) {
+ ASSERT (0);
+ return Status;
+ }
+ if (((CM_ARM_GICD_INFO*)CmObjDesc.Data)->GicVersion < 3) {
+ // IORT is only required for GicV3/4
+ AcpiTableCount -= 1;
+ }
+
+ Status = HandleCmObject (
+ CmObjectId,
+ PlatformRepo->CmAcpiTableList,
+ (sizeof (PlatformRepo->CmAcpiTableList[0]) * AcpiTableCount),
+ AcpiTableCount,
+ CmObject
+ );
+ break;
+
+ default: {
(13) Please drop the brace.

(applies to ArmKvmtoolPlatformGetObject() as well)

+ Status = EFI_NOT_FOUND;
+ DEBUG ((
+ DEBUG_ERROR,
+ "ERROR: Object 0x%x. Status = %r\n",
+ CmObjectId,
+ Status
+ ));
+ break;
+ }
+ }
+
+ return Status;
+}
+
+/** Return an ARM namespace object.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token An optional token identifying the object. If
+ unused this must be CM_NULL_TOKEN.
+ @param [in, out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+EFI_STATUS
+EFIAPI
+GetArmNameSpaceObject (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token OPTIONAL,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ EFI_STATUS Status;
+ EDKII_PLATFORM_REPOSITORY_INFO * PlatformRepo;
+
+ if ((This == NULL) || (CmObject == NULL)) {
+ ASSERT (This != NULL);
+ ASSERT (CmObject != NULL);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ Status = EFI_NOT_FOUND;
+ PlatformRepo = This->PlatRepoInfo;
+
+ // First check among the static objects.
+ switch (GET_CM_OBJECT_ID (CmObjectId)) {
+ case EArmObjPowerManagementProfileInfo:
+ Status = HandleCmObject (
+ CmObjectId,
+ &PlatformRepo->PmProfileInfo,
+ sizeof (PlatformRepo->PmProfileInfo),
+ 1,
+ CmObject
+ );
+ break;
+
+ case EArmObjItsGroup:
+ Status = HandleCmObject (
+ CmObjectId,
+ &PlatformRepo->ItsGroupInfo,
+ sizeof (PlatformRepo->ItsGroupInfo),
+ 1,
+ CmObject
+ );
+ break;
+
+ case EArmObjGicItsIdentifierArray:
+ Status = HandleCmObjectRefByToken (
+ This,
+ CmObjectId,
+ PlatformRepo->ItsIdentifierArray,
+ sizeof (PlatformRepo->ItsIdentifierArray),
+ ARRAY_SIZE (PlatformRepo->ItsIdentifierArray),
+ Token,
+ GetItsIdentifierArray,
+ CmObject
+ );
+ break;
+
+ case EArmObjRootComplex:
+ Status = HandleCmObject (
+ CmObjectId,
+ &PlatformRepo->RootComplexInfo,
+ sizeof (PlatformRepo->RootComplexInfo),
+ 1,
+ CmObject
+ );
+ break;
+
+ case EArmObjIdMappingArray:
+ Status = HandleCmObjectRefByToken (
+ This,
+ CmObjectId,
+ PlatformRepo->DeviceIdMapping,
+ sizeof (PlatformRepo->DeviceIdMapping),
+ ARRAY_SIZE (PlatformRepo->DeviceIdMapping),
+ Token,
+ GetDeviceIdMappingArray,
+ CmObject
+ );
+ break;
+
+ default:
+ // No match found among the static objects.
+ // Check the dynamic objects.
+ Status = DynamicPlatRepoGetObject (
+ PlatformRepo->DynamicPlatformRepo,
+ CmObjectId,
+ Token,
+ CmObject
+ );
+ break;
+ } // switch
+
+ if (Status == EFI_NOT_FOUND) {
+ DEBUG ((
+ DEBUG_INFO,
+ "INFO: Object 0x%x. Status = %r\n",
+ CmObjectId,
+ Status
+ ));
+ } else {
+ ASSERT_EFI_ERROR (Status);
+ }
+
+ return Status;
+}
+
+/** Return an OEM namespace object.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token An optional token identifying the object. If
+ unused this must be CM_NULL_TOKEN.
+ @param [in, out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+EFI_STATUS
+EFIAPI
+GetOemNameSpaceObject (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token OPTIONAL,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ EFI_STATUS Status;
+
+ Status = EFI_SUCCESS;
+ if ((This == NULL) || (CmObject == NULL)) {
+ ASSERT (This != NULL);
+ ASSERT (CmObject != NULL);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ switch (GET_CM_OBJECT_ID (CmObjectId)) {
+ default: {
+ Status = EFI_NOT_FOUND;
+ DEBUG ((
+ DEBUG_ERROR,
+ "ERROR: Object 0x%x. Status = %r\n",
+ CmObjectId,
+ Status
+ ));
+ break;
+ }
+ }
+
+ return Status;
+}
+
+/** The GetObject function defines the interface implemented by the
+ Configuration Manager Protocol for returning the Configuration
+ Manager Objects.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token An optional token identifying the object. If
+ unused this must be CM_NULL_TOKEN.
+ @param [in, out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+EFI_STATUS
+EFIAPI
+ArmKvmtoolPlatformGetObject (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token OPTIONAL,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ EFI_STATUS Status;
+
+ if ((This == NULL) || (CmObject == NULL)) {
+ ASSERT (This != NULL);
+ ASSERT (CmObject != NULL);
+ return EFI_INVALID_PARAMETER;
+ }
+
+ switch (GET_CM_NAMESPACE_ID (CmObjectId)) {
+ case EObjNameSpaceStandard:
+ Status = GetStandardNameSpaceObject (This, CmObjectId, Token, CmObject);
+ break;
+ case EObjNameSpaceArm:
+ Status = GetArmNameSpaceObject (This, CmObjectId, Token, CmObject);
+ break;
+ case EObjNameSpaceOem:
+ Status = GetOemNameSpaceObject (This, CmObjectId, Token, CmObject);
+ break;
+ default: {
+ Status = EFI_INVALID_PARAMETER;
+ DEBUG ((
+ DEBUG_ERROR,
+ "ERROR: Unknown Namespace Object = 0x%x. Status = %r\n",
+ CmObjectId,
+ Status
+ ));
+ break;
+ }
+ }
+
+ return Status;
+}
+
+/** The SetObject function defines the interface implemented by the
+ Configuration Manager Protocol for updating the Configuration
+ Manager Objects.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token An optional token identifying the object. If
+ unused this must be CM_NULL_TOKEN.
+ @param [in] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the Object.
+
+ @retval EFI_UNSUPPORTED This operation is not supported.
+**/
+EFI_STATUS
+EFIAPI
+ArmKvmtoolPlatformSetObject (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token OPTIONAL,
+ IN CM_OBJ_DESCRIPTOR * CONST CmObject
+ )
+{
+ return EFI_UNSUPPORTED;
+}
+
+/** A structure describing the configuration manager protocol interface.
+*/
+STATIC
+CONST
+EDKII_CONFIGURATION_MANAGER_PROTOCOL KvmtoolPlatformConfigManagerProtocol = {
+ CREATE_REVISION(1,0),
+ ArmKvmtoolPlatformGetObject,
+ ArmKvmtoolPlatformSetObject,
+ &KvmtoolPlatRepositoryInfo
+};
+
+/**
+ Entrypoint of Configuration Manager Dxe.
+
+ @param ImageHandle
+ @param SystemTable
+
+ @return EFI_SUCCESS
+ @return EFI_LOAD_ERROR
+ @return EFI_OUT_OF_RESOURCES
(14) These should be @retval, not @return.

+**/
+EFI_STATUS
+EFIAPI
+ConfigurationManagerDxeInitialize (
+ IN EFI_HANDLE ImageHandle,
+ IN EFI_SYSTEM_TABLE * SystemTable
+ )
+{
+ EFI_STATUS Status;
+
+ if (PcdGetBool (PcdForceNoAcpi)) {
+ // Use DT and not ACPI.
+ return EFI_SUCCESS;
+ }
(15) This is wrong.

KvmtoolPlatformDxe already (correctly) installs either
"gEdkiiPlatformHasAcpiGuid" or "gEdkiiPlatformHasDeviceTreeGuid".
"PcdForceNoAcpi" must not be used outside of that driver.

In this driver, the DEPEX section of the INF file should list
"gEdkiiPlatformHasAcpiGuid". If ACPI is disabled, this driver should not
be dispatched at all.
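
For illustration only, the suggested DEPEX change might look like this in the
INF (a sketch; it assumes gEdkiiPlatformHasAcpiGuid is declared in a package
the INF already lists):

```
[Depex]
  gEdkiiPlatformHasAcpiGuid
```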

+
+ Status = gBS->InstallProtocolInterface (
+ &ImageHandle,
+ &gEdkiiConfigurationManagerProtocolGuid,
+ EFI_NATIVE_INTERFACE,
+ (VOID*)&KvmtoolPlatformConfigManagerProtocol
+ );
+ if (EFI_ERROR (Status)) {
+ DEBUG ((
+ DEBUG_ERROR,
+ "ERROR: Failed to get Install Configuration Manager Protocol." \
+ " Status = %r\n",
+ Status
+ ));
+ goto error_handler;
+ }
+
+ Status = InitializePlatformRepository (
+ &KvmtoolPlatformConfigManagerProtocol
+ );
+ if (EFI_ERROR (Status)) {
+ DEBUG ((
+ DEBUG_ERROR,
+ "ERROR: Failed to initialize the Platform Configuration Repository." \
+ " Status = %r\n",
+ Status
+ ));
+ }
+
+error_handler:
+ return Status;
+}
(16) I've paid basically zero attention to your error handlers, but this
is so obviously wrong that it catches my eye. If
InitializePlatformRepository() fails and so you exit with an error, the
driver will be unloaded, and your installed protocol interface will be a
dangling one.
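
One possible shape for that error path, sketched here for illustration (not
the author's code; it assumes the protocol was installed on ImageHandle
exactly as in the patch above):

```c
  Status = InitializePlatformRepository (&KvmtoolPlatformConfigManagerProtocol);
  if (EFI_ERROR (Status)) {
    DEBUG ((
      DEBUG_ERROR,
      "ERROR: Failed to initialize the Platform Configuration Repository."
      " Status = %r\n",
      Status
      ));
    // Undo the earlier InstallProtocolInterface() so that no dangling
    // protocol interface remains once the failed driver is unloaded.
    gBS->UninstallProtocolInterface (
           ImageHandle,
           &gEdkiiConfigurationManagerProtocolGuid,
           (VOID*)&KvmtoolPlatformConfigManagerProtocol
           );
  }
  return Status;
```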

+
+/**
+ Unload function for this image.
+
+ @param ImageHandle Handle for the image of this driver.
+
+ @retval EFI_SUCCESS Driver unloaded successfully.
+ @return other Driver could not be unloaded.
+**/
+EFI_STATUS
+EFIAPI
+ConfigurationManagerDxeUnloadImage (
+ IN EFI_HANDLE ImageHandle
+ )
+{
+ return CleanupPlatformRepository (&KvmtoolPlatformConfigManagerProtocol);
+}
diff --git a/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.h b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.h
new file mode 100644
index 000000000000..94cfca3b7671
--- /dev/null
+++ b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.h
@@ -0,0 +1,94 @@
+/** @file
+
+ Copyright (c) 2021, Arm Limited. All rights reserved.<BR>
+
+ SPDX-License-Identifier: BSD-2-Clause-Patent
+
+ @par Glossary:
+ - Cm or CM - Configuration Manager
+ - Obj or OBJ - Object
+**/
+
+#ifndef CONFIGURATION_MANAGER_H_
+#define CONFIGURATION_MANAGER_H_
+
+/** C array containing the compiled AML template.
+ This symbol is defined in the auto generated C file
+ containing the AML bytecode array.
+*/
(17) Please apply all the comment observations I made for the C file to
the header file.

+extern CHAR8 dsdt_aml_code[];
+
+/** The configuration manager version.
+*/
+#define CONFIGURATION_MANAGER_REVISION CREATE_REVISION (1, 0)
+
+/** The OEM ID
+*/
+#define CFG_MGR_OEM_ID { 'A', 'R', 'M', 'L', 'T', 'D' }
+
+/** A function that prepares Configuration Manager Objects for returning.
+
+ @param [in] This Pointer to the Configuration Manager Protocol.
+ @param [in] CmObjectId The Configuration Manager Object ID.
+ @param [in] Token A token for identifying the object.
+ @param [out] CmObject Pointer to the Configuration Manager Object
+ descriptor describing the requested Object.
+
+ @retval EFI_SUCCESS Success.
+ @retval EFI_INVALID_PARAMETER A parameter is invalid.
+ @retval EFI_NOT_FOUND The required object information is not found.
+**/
+typedef EFI_STATUS (*CM_OBJECT_HANDLER_PROC) (
+ IN CONST EDKII_CONFIGURATION_MANAGER_PROTOCOL * CONST This,
+ IN CONST CM_OBJECT_ID CmObjectId,
+ IN CONST CM_OBJECT_TOKEN Token,
+ IN OUT CM_OBJ_DESCRIPTOR * CONST CmObject
+ );
+
+/** A helper macro for mapping a reference token.
+*/
+#define REFERENCE_TOKEN(Field) \
+ (CM_OBJECT_TOKEN)((UINT8*)&KvmtoolPlatRepositoryInfo + \
+ OFFSET_OF (EDKII_PLATFORM_REPOSITORY_INFO, Field))
+
+/** The number of ACPI tables to install
+*/
+#define PLAT_ACPI_TABLE_COUNT 10
+
+/** A structure describing the platform configuration
+ manager repository information
+*/
+typedef struct PlatformRepositoryInfo {
(18) the "PlatformRepositoryInfo" struct tag is not spelled
idiomatically. On the other hand, it's also not used anywhere. I suggest
simply dropping it.

+ /// Configuration Manager Information.
+ CM_STD_OBJ_CONFIGURATION_MANAGER_INFO CmInfo;
+
+ /// List of ACPI tables
+ CM_STD_OBJ_ACPI_TABLE_INFO CmAcpiTableList[PLAT_ACPI_TABLE_COUNT];
+
+ /// Power management profile information
+ CM_ARM_POWER_MANAGEMENT_PROFILE_INFO PmProfileInfo;
+
+ /// ITS Group node
+ CM_ARM_ITS_GROUP_NODE ItsGroupInfo;
+
+ /// ITS Identifier array
+ CM_ARM_ITS_IDENTIFIER ItsIdentifierArray[1];
+
+ /// PCI Root complex node
+ CM_ARM_ROOT_COMPLEX_NODE RootComplexInfo;
+
+ /// Array of DeviceID mapping
+ CM_ARM_ID_MAPPING DeviceIdMapping[1];
+
+ /// Dynamic platform repository.
+ /// CmObj created by parsing the Kvmtool device tree are stored here.
+ DYNAMIC_PLATFORM_REPOSITORY_INFO * DynamicPlatformRepo;
+
+ /// Base address of the FDT.
+ VOID * FdtBase;
+
+ /// A handle to the FDT HwInfoParser.
+ HW_INFO_PARSER_HANDLE FdtParserHandle;
+} EDKII_PLATFORM_REPOSITORY_INFO;
+
+#endif // CONFIGURATION_MANAGER_H_
diff --git a/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf
new file mode 100644
index 000000000000..9f0bf72fce2d
--- /dev/null
+++ b/ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf
@@ -0,0 +1,58 @@
+## @file
+# Configuration Manager Dxe
+#
+# Copyright (c) 2021, Arm Limited. All rights reserved.<BR>
+#
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+##
+
+[Defines]
+ INF_VERSION = 0x0001001B
+ BASE_NAME = ConfigurationManagerDxe
+ FILE_GUID = 3C80D366-510C-4154-BB3A-E12439AD337C
+ MODULE_TYPE = DXE_DRIVER
+ VERSION_STRING = 1.0
+ ENTRY_POINT = ConfigurationManagerDxeInitialize
+ UNLOAD_IMAGE = ConfigurationManagerDxeUnloadImage
+
+#
+# The following information is for reference only and not required by the build tools.
+#
+# VALID_ARCHITECTURES = ARM AARCH64
+#
+
+[Sources]
+ AslTables/Dsdt.asl
+ ConfigurationManager.c
+ ConfigurationManager.h
+ ConfigurationManagerDxe.inf
+
+[Packages]
+ ArmVirtPkg/ArmVirtPkg.dec
+ DynamicTablesPkg/DynamicTablesPkg.dec
+ EmbeddedPkg/EmbeddedPkg.dec
+ MdePkg/MdePkg.dec
+
+[LibraryClasses]
+ DynamicPlatRepoLib
+ HobLib
+ HwInfoParserLib
+ PrintLib
+ TableHelperLib
+ UefiBootServicesTableLib
+ UefiDriverEntryPoint
+ UefiRuntimeServicesTableLib
+
+[Protocols]
+ gEdkiiConfigurationManagerProtocolGuid
+
+[Guids]
+ gFdtHobGuid
+
+[FixedPcd]
(19) please drop this empty section.

+
+[Pcd]
+ gArmVirtTokenSpaceGuid.PcdForceNoAcpi
+
+[Depex]
+ TRUE
I can't see myself reviewing this again (even at this level), so I'll
trust you on all the updates.

Acked-by: Laszlo Ersek <lersek@redhat.com>

Thanks
Laszlo


Re: [PATCH v1 0/5] Add ACPI support for Kvmtool

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:
From: Pierre Gondois <Pierre.Gondois@arm.com>

Kvmtool dynamically generates a device tree describing the platform
to boot on. Using the patch-sets listed below, the DynamicTables
framework generates ACPI tables describing a similar platform.

This patch-set:
- adds a ConfigurationManager that allows generating ACPI tables
for Kvmtool
- adds the acpiview command line utility to the ArmVirtPkg
- updates ArmVirtPkg.ci.yaml to add new words and use the
DynamicTablesPkg

This patch-set also sets the default platform description format
to ACPI instead of the device tree (i.e. PcdForceNoAcpi is set
to FALSE).

The changes can be seen at: https://github.com/PierreARM/edk2/tree/1456_Add_ACPI_support_for_Kvmtool_v1
The results of the CI can be seen at: https://github.com/tianocore/edk2/pull/1753

This patch-set is dependent over the following patch-sets:
[PATCH v1 00/10] Various DynamicTablesPkg modifications
https://edk2.groups.io/g/devel/message/76929
and:
[PATCH v1 00/13] Create a SSDT CPU topology generator
https://edk2.groups.io/g/devel/message/76941
and:
[PATCH v1 0/7] Create a SSDT PCIe generator
https://edk2.groups.io/g/devel/message/76958
and:
[PATCH v1 00/14] Implement a FdtHwInfoParserLib
https://edk2.groups.io/g/devel/message/76967
and:
[PATCH v1 0/5] Add DynamicPlatRepoLib
https://edk2.groups.io/g/devel/message/76984
Not sure if you want just one BZ for all of these subfeatures, or one BZ
per subfeature, but we definitely need at least one BZ for this series.
Please update the commit messages accordingly.

Thanks
Laszlo


Pierre Gondois (1):
ArmVirtPkg: Add cspell exceptions

Sami Mujawar (4):
ArmVirtPkg: Add DSDT ACPI table for Kvmtool firmware
ArmVirtPkg: Add Configuration Manager for Kvmtool firmware
ArmVirtPkg: Enable ACPI support for Kvmtool
ArmVirtPkg: Enable Acpiview for ArmVirtPkg

ArmVirtPkg/ArmVirt.dsc.inc | 3 +-
ArmVirtPkg/ArmVirtKvmTool.dsc | 18 +-
ArmVirtPkg/ArmVirtKvmTool.fdf | 11 +
ArmVirtPkg/ArmVirtPkg.ci.yaml | 3 +
.../KvmtoolCfgMgrDxe/AslTables/Dsdt.asl | 19 +
.../KvmtoolCfgMgrDxe/ConfigurationManager.c | 948 ++++++++++++++++++
.../KvmtoolCfgMgrDxe/ConfigurationManager.h | 94 ++
.../ConfigurationManagerDxe.inf | 58 ++
8 files changed, 1151 insertions(+), 3 deletions(-)
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/AslTables/Dsdt.asl
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.c
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManager.h
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/ConfigurationManagerDxe.inf


Re: [PATCH v1 2/5] ArmVirtPkg: Add DSDT ACPI table for Kvmtool firmware

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:
From: Sami Mujawar <sami.mujawar@arm.com>

Most ACPI tables for Kvmtool firmware are dynamically
generated. The AML code is also generated at runtime
for most components in appropriate SSDTs.

Although there may not be much to describe in the DSDT,
the DSDT table is mandatory.

Therefore, add an empty stub for DSDT.

Signed-off-by: Sami Mujawar <sami.mujawar@arm.com>
Signed-off-by: Pierre Gondois <Pierre.Gondois@arm.com>
---
.../KvmtoolCfgMgrDxe/AslTables/Dsdt.asl | 19 +++++++++++++++++++
1 file changed, 19 insertions(+)
create mode 100644 ArmVirtPkg/KvmtoolCfgMgrDxe/AslTables/Dsdt.asl

diff --git a/ArmVirtPkg/KvmtoolCfgMgrDxe/AslTables/Dsdt.asl b/ArmVirtPkg/KvmtoolCfgMgrDxe/AslTables/Dsdt.asl
new file mode 100644
index 000000000000..8467d1ede4ec
--- /dev/null
+++ b/ArmVirtPkg/KvmtoolCfgMgrDxe/AslTables/Dsdt.asl
@@ -0,0 +1,19 @@
+/** @file
+ Differentiated System Description Table Fields (DSDT)
+
+ Copyright (c) 2021, ARM Ltd. All rights reserved.<BR>
+ SPDX-License-Identifier: BSD-2-Clause-Patent
+
+**/
+
+DefinitionBlock ("DsdtTable.aml", "DSDT", 1, "ARMLTD", "ARM-KVMT", 1) {
+ Scope (_SB) {
+ // Most ACPI tables for Kvmtool firmware are
+ // dynamically generated. The AML code is also
+ // generated at runtime for most components in
+ // appropriate SSDTs.
+ // Although there may not be much to describe
+ // in the DSDT, the DSDT table is mandatory.
+ // Therefore, add an empty stub for DSDT.
+ } // Scope (_SB)
+}
Please insert empty // lines at the top and bottom of the comment block,
to stick more closely with the edk2 coding style.

Reviewed-by: Laszlo Ersek <lersek@redhat.com>


Re: [PATCH v1 1/5] ArmVirtPkg: Add cspell exceptions

Laszlo Ersek
 

On 06/23/21 16:06, PierreGondois wrote:
From: Pierre Gondois <Pierre.Gondois@arm.com>

The cspell tool checks for unknown words in the upstream CI.
Add some new words to the list of exceptions.

Signed-off-by: Pierre Gondois <Pierre.Gondois@arm.com>
---
ArmVirtPkg/ArmVirtPkg.ci.yaml | 3 +++
1 file changed, 3 insertions(+)

diff --git a/ArmVirtPkg/ArmVirtPkg.ci.yaml b/ArmVirtPkg/ArmVirtPkg.ci.yaml
index 5f427e57233e..e3f30d69e89a 100644
--- a/ArmVirtPkg/ArmVirtPkg.ci.yaml
+++ b/ArmVirtPkg/ArmVirtPkg.ci.yaml
@@ -47,6 +47,7 @@
"MdePkg/MdePkg.dec",
"MdeModulePkg/MdeModulePkg.dec",
"ArmVirtPkg/ArmVirtPkg.dec",
+ "DynamicTablesPkg/DynamicTablesPkg.dec",
"NetworkPkg/NetworkPkg.dec",
"ArmPkg/ArmPkg.dec",
"OvmfPkg/OvmfPkg.dec",
@@ -97,6 +98,8 @@
"AuditOnly": False, # Fails right now with over 270 errors
"IgnoreFiles": [], # use gitignore syntax to ignore errors in matching files
"ExtendWords": [
+ "armltd",
+ "ssdts",
"setjump",
"plong",
"lparam",
Reviewed-by: Laszlo Ersek <lersek@redhat.com>


Re: [PATCH v8] IntelFsp2Pkg: Add Config Editor tool support

Chiu, Chasel
 

Reviewed-by: Chasel Chiu <chasel.chiu@intel.com>

-----Original Message-----
From: Loo, Tung Lun <tung.lun.loo@intel.com>
Sent: Thursday, June 24, 2021 5:14 PM
To: devel@edk2.groups.io
Cc: Loo, Tung Lun <tung.lun.loo@intel.com>; Ma, Maurice
<maurice.ma@intel.com>; Desimone, Nathaniel L
<nathaniel.l.desimone@intel.com>; Zeng, Star <star.zeng@intel.com>; Chiu,
Chasel <chasel.chiu@intel.com>
Subject: [PATCH v8] IntelFsp2Pkg: Add Config Editor tool support

This is a GUI interface that can be used by users who
would like to change configuration settings directly
from the interface without having to modify the source.

This tool depends on the Python GUI toolkit Tkinter.
It runs on both Windows and Linux.

The user loads the YAML file, along with the DLT file for a
specific board, into the ConfigEditor and changes the desired
configuration values. Finally, a new configuration delta file or
a config binary blob is generated for the newly changed values to
take effect. These are the inputs to the merge tool or the stitch
tool, so that new config changes can be merged and stitched into
the final configuration blob.

This tool also supports updating a binary directly and displaying
FSP information. It is also backward compatible with the BSF file format.

Running Configuration Editor:
python ConfigEditor.py

Co-authored-by: Maurice Ma <maurice.ma@intel.com>
Cc: Maurice Ma <maurice.ma@intel.com>
Cc: Nate DeSimone <nathaniel.l.desimone@intel.com>
Cc: Star Zeng <star.zeng@intel.com>
Cc: Chasel Chiu <chasel.chiu@intel.com>
Signed-off-by: Loo Tung Lun <tung.lun.loo@intel.com>
---
IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py |  504 ++++++++
IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py  | 1499 ++++++++++++++
IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py    | 2252 +++++++++++++++++++
IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py    |  324 +++++
IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py             |  376 ++++-----
IntelFsp2Pkg/Tools/FspGenCfgData.py              | 2637 ++++++++++++++++++++++
6 files changed, 7295 insertions(+), 297 deletions(-)

diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py
b/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py
new file mode 100644
index 0000000000..1229279116
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py
@@ -0,0 +1,504 @@
+#!/usr/bin/env python

+# @ CommonUtility.py

+# Common utility script

+#

+# Copyright (c) 2016 - 2021, Intel Corporation. All rights reserved.<BR>

+# SPDX-License-Identifier: BSD-2-Clause-Patent

+#

+##

+

+import os

+import sys

+import shutil

+import subprocess

+import string

+from ctypes import ARRAY, c_char, c_uint16, c_uint32, \

+ c_uint8, Structure, sizeof

+from importlib.machinery import SourceFileLoader

+from SingleSign import single_sign_file, single_sign_gen_pub_key

+

+

+# Key types defined should match with cryptolib.h

+PUB_KEY_TYPE = {

+ "RSA": 1,

+ "ECC": 2,

+ "DSA": 3,

+ }

+

+# Signing type schemes defined should match with cryptolib.h

+SIGN_TYPE_SCHEME = {

+ "RSA_PKCS1": 1,

+ "RSA_PSS": 2,

+ "ECC": 3,

+ "DSA": 4,

+ }

+

+# Hash values defined should match with cryptolib.h

+HASH_TYPE_VALUE = {

+ "SHA2_256": 1,

+ "SHA2_384": 2,

+ "SHA2_512": 3,

+ "SM3_256": 4,

+ }

+

+# Hash values defined should match with cryptolib.h

+HASH_VAL_STRING = dict(map(reversed, HASH_TYPE_VALUE.items()))

+

+AUTH_TYPE_HASH_VALUE = {

+ "SHA2_256": 1,

+ "SHA2_384": 2,

+ "SHA2_512": 3,

+ "SM3_256": 4,

+ "RSA2048SHA256": 1,

+ "RSA3072SHA384": 2,

+ }

+

+HASH_DIGEST_SIZE = {

+ "SHA2_256": 32,

+ "SHA2_384": 48,

+ "SHA2_512": 64,

+ "SM3_256": 32,

+ }

+

+

+class PUB_KEY_HDR (Structure):

+ _pack_ = 1

+ _fields_ = [

+ ('Identifier', ARRAY(c_char, 4)), # signature ('P', 'U', 'B', 'K')

+ ('KeySize', c_uint16), # Length of Public Key

+ ('KeyType', c_uint8), # RSA or ECC

+ ('Reserved', ARRAY(c_uint8, 1)),

+ ('KeyData', ARRAY(c_uint8, 0)),

+ ]

+

+ def __init__(self):

+ self.Identifier = b'PUBK'

+

+

+class SIGNATURE_HDR (Structure):

+ _pack_ = 1

+ _fields_ = [

+ ('Identifier', ARRAY(c_char, 4)),

+ ('SigSize', c_uint16),

+ ('SigType', c_uint8),

+ ('HashAlg', c_uint8),

+ ('Signature', ARRAY(c_uint8, 0)),

+ ]

+

+ def __init__(self):

+ self.Identifier = b'SIGN'

+

+

+class LZ_HEADER(Structure):

+ _pack_ = 1

+ _fields_ = [

+ ('signature', ARRAY(c_char, 4)),

+ ('compressed_len', c_uint32),

+ ('length', c_uint32),

+ ('version', c_uint16),

+ ('svn', c_uint8),

+ ('attribute', c_uint8)

+ ]

+ _compress_alg = {

+ b'LZDM': 'Dummy',

+ b'LZ4 ': 'Lz4',

+ b'LZMA': 'Lzma',

+ }

+

+
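
As a standalone illustration of how a packed header like LZ_HEADER above is
meant to be consumed, here is a runnable sketch (the class is re-declared and
the sample container bytes are made up, not taken from a real image):

```python
# Parse a dummy LZ container header with ctypes, mirroring LZ_HEADER above.
from ctypes import ARRAY, c_char, c_uint8, c_uint16, c_uint32, Structure, sizeof

class LZ_HEADER(Structure):
    _pack_ = 1
    _fields_ = [
        ('signature',      ARRAY(c_char, 4)),
        ('compressed_len', c_uint32),
        ('length',         c_uint32),
        ('version',        c_uint16),
        ('svn',            c_uint8),
        ('attribute',      c_uint8),
    ]

# A fabricated 'LZDM' (uncompressed/dummy) container wrapping b'hello'.
raw = (b'LZDM'
       + (5).to_bytes(4, 'little')    # compressed_len
       + (5).to_bytes(4, 'little')    # length
       + (1).to_bytes(2, 'little')    # version
       + bytes([0, 0])                # svn, attribute
       + b'hello')

hdr = LZ_HEADER.from_buffer_copy(raw)
payload = raw[sizeof(LZ_HEADER):sizeof(LZ_HEADER) + hdr.compressed_len]
print(hdr.signature, payload)
```

With _pack_ = 1 the header occupies exactly 16 bytes, so the payload slice
starts at sizeof(LZ_HEADER), which is the same arithmetic decompress() uses.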

+def print_bytes(data, indent=0, offset=0, show_ascii=False):

+ bytes_per_line = 16

+ printable = ' ' + string.ascii_letters + string.digits + string.punctuation

+ str_fmt = '{:s}{:04x}: {:%ds} {:s}' % (bytes_per_line * 3)

+ data_array = bytearray(data)

+ for idx in range(0, len(data_array), bytes_per_line):

+ hex_str = ' '.join(

+ '%02X' % val for val in data_array[idx:idx + bytes_per_line])

+ asc_str = ''.join('%c' % (val if (chr(val) in printable) else '.')

+ for val in data_array[idx:idx + bytes_per_line])

+ print(str_fmt.format(

+ indent * ' ',

+ offset + idx, hex_str,

+ ' ' + asc_str if show_ascii else ''))

+

+

+def get_bits_from_bytes(bytes, start, length):

+ if length == 0:

+ return 0

+ byte_start = (start) // 8

+ byte_end = (start + length - 1) // 8

+ bit_start = start & 7

+ mask = (1 << length) - 1

+ val = bytes_to_value(bytes[byte_start:byte_end + 1])

+ val = (val >> bit_start) & mask

+ return val

+

+

+def set_bits_to_bytes(bytes, start, length, bvalue):

+ if length == 0:

+ return

+ byte_start = (start) // 8

+ byte_end = (start + length - 1) // 8

+ bit_start = start & 7

+ mask = (1 << length) - 1

+ val = bytes_to_value(bytes[byte_start:byte_end + 1])

+ val &= ~(mask << bit_start)

+ val |= ((bvalue & mask) << bit_start)

+ bytes[byte_start:byte_end+1] = value_to_bytearray(

+ val,

+ byte_end + 1 - byte_start)

+

+
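
For reference, a minimal standalone restatement of the two bit-field helpers
above (names re-declared, together with the byte-conversion helpers they rely
on, so the snippet runs outside the patch):

```python
# Standalone sketch of the bit-field helpers from CommonUtility.py above.

def bytes_to_value(data):
    return int.from_bytes(data, 'little')

def value_to_bytearray(value, length):
    return bytearray(value.to_bytes(length, 'little'))

def get_bits_from_bytes(data, start, length):
    # Extract 'length' bits at bit offset 'start' (LSB-first, little-endian).
    if length == 0:
        return 0
    byte_start = start // 8
    byte_end = (start + length - 1) // 8
    bit_start = start & 7
    mask = (1 << length) - 1
    val = bytes_to_value(data[byte_start:byte_end + 1])
    return (val >> bit_start) & mask

def set_bits_to_bytes(data, start, length, bvalue):
    # Write 'length' bits of 'bvalue' at bit offset 'start', in place.
    if length == 0:
        return
    byte_start = start // 8
    byte_end = (start + length - 1) // 8
    bit_start = start & 7
    mask = (1 << length) - 1
    val = bytes_to_value(data[byte_start:byte_end + 1])
    val &= ~(mask << bit_start)
    val |= (bvalue & mask) << bit_start
    data[byte_start:byte_end + 1] = value_to_bytearray(val, byte_end + 1 - byte_start)

buf = bytearray(2)
set_bits_to_bytes(buf, 4, 8, 0xAB)       # straddles the byte boundary
print(get_bits_from_bytes(buf, 4, 8))    # recovers 0xAB
```

The round-trip works even when the field straddles a byte boundary, which is
exactly the case these helpers exist for.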

+def value_to_bytes(value, length):

+ return value.to_bytes(length, 'little')

+

+

+def bytes_to_value(bytes):

+ return int.from_bytes(bytes, 'little')

+

+

+def value_to_bytearray(value, length):

+ return bytearray(value_to_bytes(value, length))

+

+# def value_to_bytearray (value, length):
+#     return bytearray(value_to_bytes(value, length))

+

+

+def get_aligned_value(value, alignment=4):

+ if alignment != (1 << (alignment.bit_length() - 1)):

+ raise Exception(

+ 'Alignment (0x%x) should to be power of 2 !' % alignment)

+ value = (value + (alignment - 1)) & ~(alignment - 1)

+ return value

+

+

+def get_padding_length(data_len, alignment=4):

+ new_data_len = get_aligned_value(data_len, alignment)

+ return new_data_len - data_len

+

+
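
The alignment arithmetic above is compact enough to be easy to misread; here
is a runnable restatement of the two helpers (re-declared for the example):

```python
# Standalone sketch of the alignment helpers from CommonUtility.py above.

def get_aligned_value(value, alignment=4):
    # Round 'value' up to the next multiple of 'alignment' (a power of two).
    if alignment != (1 << (alignment.bit_length() - 1)):
        raise ValueError('Alignment (0x%x) should be a power of 2!' % alignment)
    return (value + (alignment - 1)) & ~(alignment - 1)

def get_padding_length(data_len, alignment=4):
    # Bytes needed to pad data_len up to the next aligned boundary.
    return get_aligned_value(data_len, alignment) - data_len

print(get_aligned_value(13, 8))    # rounds 13 up to 16
print(get_padding_length(13, 8))   # 3 padding bytes needed
```

The power-of-two check matters: the mask trick `(value + a - 1) & ~(a - 1)`
is only correct when the alignment has a single bit set.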

+def get_file_data(file, mode='rb'):

+ return open(file, mode).read()

+

+

+def gen_file_from_object(file, object):

+ open(file, 'wb').write(object)

+

+

+def gen_file_with_size(file, size):

+ open(file, 'wb').write(b'\xFF' * size)

+

+

+def check_files_exist(base_name_list, dir='', ext=''):

+ for each in base_name_list:

+ if not os.path.exists(os.path.join(dir, each + ext)):

+ return False

+ return True

+

+

+def load_source(name, filepath):

+ mod = SourceFileLoader(name, filepath).load_module()

+ return mod

+

+

+def get_openssl_path():

+ if os.name == 'nt':

+ if 'OPENSSL_PATH' not in os.environ:

+ openssl_dir = "C:\\Openssl\\bin\\"

+ if os.path.exists(openssl_dir):

+ os.environ['OPENSSL_PATH'] = openssl_dir

+ else:

+ os.environ['OPENSSL_PATH'] = "C:\\Openssl\\"

+ if 'OPENSSL_CONF' not in os.environ:

+ openssl_cfg = "C:\\Openssl\\openssl.cfg"

+ if os.path.exists(openssl_cfg):

+ os.environ['OPENSSL_CONF'] = openssl_cfg

+ openssl = os.path.join(

+ os.environ.get('OPENSSL_PATH', ''),

+ 'openssl.exe')

+ else:

+ # Get openssl path for Linux cases

+ openssl = shutil.which('openssl')

+

+ return openssl

+

+

+def run_process(arg_list, print_cmd=False, capture_out=False):

+ sys.stdout.flush()

+ if os.name == 'nt' and os.path.splitext(arg_list[0])[1] == '' and \

+ os.path.exists(arg_list[0] + '.exe'):

+ arg_list[0] += '.exe'

+ if print_cmd:

+ print(' '.join(arg_list))

+

+ exc = None

+ result = 0

+ output = ''

+ try:

+ if capture_out:

+ output = subprocess.check_output(arg_list).decode()

+ else:

+ result = subprocess.call(arg_list)

+ except Exception as ex:

+ result = 1

+ exc = ex

+

+ if result:

+ if not print_cmd:

+ print('Error in running process:\n %s' % ' '.join(arg_list))

+ if exc is None:

+ sys.exit(1)

+ else:

+ raise exc

+

+ return output

+

+

+# Adjust hash type algorithm based on Public key file

+def adjust_hash_type(pub_key_file):

+ key_type = get_key_type(pub_key_file)

+ if key_type == 'RSA2048':

+ hash_type = 'SHA2_256'

+ elif key_type == 'RSA3072':

+ hash_type = 'SHA2_384'

+ else:

+ hash_type = None

+

+ return hash_type

+

+

+def rsa_sign_file(

+ priv_key, pub_key, hash_type, sign_scheme,

+ in_file, out_file, inc_dat=False, inc_key=False):

+

+ bins = bytearray()

+ if inc_dat:

+ bins.extend(get_file_data(in_file))

+

+

+ single_sign_file(priv_key, hash_type, sign_scheme, in_file, out_file)

+

+ out_data = get_file_data(out_file)

+

+ sign = SIGNATURE_HDR()

+ sign.SigSize = len(out_data)

+ sign.SigType = SIGN_TYPE_SCHEME[sign_scheme]

+ sign.HashAlg = HASH_TYPE_VALUE[hash_type]

+

+ bins.extend(bytearray(sign) + out_data)

+ if inc_key:

+ key = gen_pub_key(priv_key, pub_key)

+ bins.extend(key)

+

+ if len(bins) != len(out_data):

+ gen_file_from_object(out_file, bins)

+

+

+def get_key_type(in_key):

+

+ # Check in_key is file or key Id

+ if not os.path.exists(in_key):

+ key = bytearray(gen_pub_key(in_key))

+ else:

+ # Check for public key in binary format.

+ key = bytearray(get_file_data(in_key))

+

+ pub_key_hdr = PUB_KEY_HDR.from_buffer(key)

+ if pub_key_hdr.Identifier != b'PUBK':

+ pub_key = gen_pub_key(in_key)

+ pub_key_hdr = PUB_KEY_HDR.from_buffer(pub_key)

+

+ key_type = next(

+ (key for key,

+ value in PUB_KEY_TYPE.items() if value == pub_key_hdr.KeyType))

+ return '%s%d' % (key_type, (pub_key_hdr.KeySize - 4) * 8)

+

+

+def get_auth_hash_type(key_type, sign_scheme):

+ if key_type == "RSA2048" and sign_scheme == "RSA_PKCS1":

+ hash_type = 'SHA2_256'

+ auth_type = 'RSA2048_PKCS1_SHA2_256'

+ elif key_type == "RSA3072" and sign_scheme == "RSA_PKCS1":

+ hash_type = 'SHA2_384'

+ auth_type = 'RSA3072_PKCS1_SHA2_384'

+ elif key_type == "RSA2048" and sign_scheme == "RSA_PSS":

+ hash_type = 'SHA2_256'

+ auth_type = 'RSA2048_PSS_SHA2_256'

+ elif key_type == "RSA3072" and sign_scheme == "RSA_PSS":

+ hash_type = 'SHA2_384'

+ auth_type = 'RSA3072_PSS_SHA2_384'

+ else:

+ hash_type = ''

+ auth_type = ''

+ return auth_type, hash_type

+

+

+# def single_sign_gen_pub_key(in_key, pub_key_file=None):

+

+

+def gen_pub_key(in_key, pub_key=None):

+

+ keydata = single_sign_gen_pub_key(in_key, pub_key)

+

+ publickey = PUB_KEY_HDR()

+ publickey.KeySize = len(keydata)

+ publickey.KeyType = PUB_KEY_TYPE['RSA']

+

+ key = bytearray(publickey) + keydata

+

+ if pub_key:

+ gen_file_from_object(pub_key, key)

+

+ return key

+

+

+def decompress(in_file, out_file, tool_dir=''):

+ if not os.path.isfile(in_file):

+ raise Exception("Invalid input file '%s' !" % in_file)

+

+ # Remove the Lz Header

+ fi = open(in_file, 'rb')

+ di = bytearray(fi.read())

+ fi.close()

+

+ lz_hdr = LZ_HEADER.from_buffer(di)

+ offset = sizeof(lz_hdr)

+ if lz_hdr.signature == b"LZDM" or lz_hdr.compressed_len == 0:

+ fo = open(out_file, 'wb')

+ fo.write(di[offset:offset + lz_hdr.compressed_len])

+ fo.close()

+ return

+

+ temp = os.path.splitext(out_file)[0] + '.tmp'

+ if lz_hdr.signature == b"LZMA":

+ alg = "Lzma"

+ elif lz_hdr.signature == b"LZ4 ":

+ alg = "Lz4"

+ else:

+ raise Exception("Unsupported compression '%s' !" % lz_hdr.signature)

+

+ fo = open(temp, 'wb')

+ fo.write(di[offset:offset + lz_hdr.compressed_len])

+ fo.close()

+

+ compress_tool = "%sCompress" % alg

+ if alg == "Lz4":

+ try:

+ cmdline = [

+ os.path.join(tool_dir, compress_tool),

+ "-d",

+ "-o", out_file,

+ temp]

+ run_process(cmdline, False, True)

+ except Exception:

+ msg_string = "Could not find/use CompressLz4 tool, " \

+ "trying with python lz4..."

+ print(msg_string)

+ try:

+ import lz4.block

+ if lz4.VERSION != '3.1.1':

+ msg_string = "Recommended lz4 module version " \

+ "is '3.1.1'," + lz4.VERSION \

+ + " is currently installed."

+ print(msg_string)

+ except ImportError:

+ msg_string = "Could not import lz4, use " \

+ "'python -m pip install lz4==3.1.1' " \

+ "to install it."

+ print(msg_string)

+ exit(1)

+ decompress_data = lz4.block.decompress(get_file_data(temp))

+ with open(out_file, "wb") as lz4bin:

+ lz4bin.write(decompress_data)

+ else:

+ cmdline = [

+ os.path.join(tool_dir, compress_tool),

+ "-d",

+ "-o", out_file,

+ temp]

+ run_process(cmdline, False, True)

+ os.remove(temp)

+

+

+def compress(in_file, alg, svn=0, out_path='', tool_dir=''):

+ if not os.path.isfile(in_file):

+ raise Exception("Invalid input file '%s' !" % in_file)

+

+ basename, ext = os.path.splitext(os.path.basename(in_file))

+ if out_path:

+ if os.path.isdir(out_path):

+ out_file = os.path.join(out_path, basename + '.lz')

+ else:

+ out_file = os.path.join(out_path)

+ else:

+ out_file = os.path.splitext(in_file)[0] + '.lz'

+

+ if alg == "Lzma":

+ sig = "LZMA"

+ elif alg == "Tiano":

+ sig = "LZUF"

+ elif alg == "Lz4":

+ sig = "LZ4 "

+ elif alg == "Dummy":

+ sig = "LZDM"

+ else:

+ raise Exception("Unsupported compression '%s' !" % alg)

+

+ in_len = os.path.getsize(in_file)

+ if in_len > 0:

+ compress_tool = "%sCompress" % alg

+ if sig == "LZDM":

+ shutil.copy(in_file, out_file)

+ compress_data = get_file_data(out_file)

+ elif sig == "LZ4 ":

+ try:

+ cmdline = [

+ os.path.join(tool_dir, compress_tool),

+ "-e",

+ "-o", out_file,

+ in_file]

+ run_process(cmdline, False, True)

+ compress_data = get_file_data(out_file)

+ except Exception:

+ msg_string = "Could not find/use CompressLz4 tool, " \

+ "trying with python lz4..."

+ print(msg_string)

+ try:

+ import lz4.block

+ if lz4.VERSION != '3.1.1':

+ msg_string = "Recommended lz4 module version " \

+ "is '3.1.1', " + lz4.VERSION \

+ + " is currently installed."

+ print(msg_string)

+ except ImportError:

+ msg_string = "Could not import lz4, use " \

+ "'python -m pip install lz4==3.1.1' " \

+ "to install it."

+ print(msg_string)

+ exit(1)

+ compress_data = lz4.block.compress(

+ get_file_data(in_file),

+ mode='high_compression')

+ elif sig == "LZMA":

+ cmdline = [

+ os.path.join(tool_dir, compress_tool),

+ "-e",

+ "-o", out_file,

+ in_file]

+ run_process(cmdline, False, True)

+ compress_data = get_file_data(out_file)

+ else:

+ compress_data = bytearray()

+

+ lz_hdr = LZ_HEADER()

+ lz_hdr.signature = sig.encode()

+ lz_hdr.svn = svn

+ lz_hdr.compressed_len = len(compress_data)

+ lz_hdr.length = os.path.getsize(in_file)

+ data = bytearray()

+ data.extend(lz_hdr)

+ data.extend(compress_data)

+ gen_file_from_object(out_file, data)

+

+ return out_file
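[Editor's sketch, not part of the patch: `compress()` prepends an `LZ_HEADER` to the payload; for the "Dummy" (`LZDM`) algorithm the payload is simply stored uncompressed. The ctypes layout below is an assumption that only mirrors the fields the code above actually touches (`signature`, `svn`, `compressed_len`, `length`):]

```python
from ctypes import Structure, c_char, c_uint32, sizeof


class LZ_HEADER(Structure):
    # Assumed layout; real field order/widths may differ in the tool.
    _pack_ = 1
    _fields_ = [('signature', c_char * 4),
                ('svn', c_uint32),
                ('compressed_len', c_uint32),
                ('length', c_uint32)]


def dummy_compress(data, svn=0):
    # "LZDM" path: store the payload as-is behind the header.
    hdr = LZ_HEADER()
    hdr.signature = b'LZDM'
    hdr.svn = svn
    hdr.compressed_len = len(data)
    hdr.length = len(data)
    return bytearray(hdr) + data
```

`decompress()` recognizes the `LZDM` signature and copies `compressed_len` bytes straight out from behind the header.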

diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py b/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py
new file mode 100644
index 0000000000..a7f79bbc96
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py
@@ -0,0 +1,1499 @@
+# @ ConfigEditor.py

+#

+# Copyright(c) 2018 - 2021, Intel Corporation. All rights reserved.<BR>

+# SPDX-License-Identifier: BSD-2-Clause-Patent

+#

+##

+

+import os

+import sys

+import marshal

+import tkinter

+import tkinter.ttk as ttk

+import tkinter.messagebox as messagebox

+import tkinter.filedialog as filedialog

+

+from pathlib import Path

+from GenYamlCfg import CGenYamlCfg, bytes_to_value, \

+ bytes_to_bracket_str, value_to_bytes, array_str_to_value

+from ctypes import sizeof, Structure, ARRAY, c_uint8, c_uint64, c_char, \

+ c_uint32, c_uint16

+from functools import reduce

+

+sys.path.insert(0, '..')

+from FspDscBsf2Yaml import bsf_to_dsc, dsc_to_yaml # noqa

+

+

+sys.dont_write_bytecode = True

+

+

+class create_tool_tip(object):

+ '''

+ create a tooltip for a given widget

+ '''

+ in_progress = False

+

+ def __init__(self, widget, text=''):

+ self.top_win = None

+ self.widget = widget

+ self.text = text

+ self.widget.bind("<Enter>", self.enter)

+ self.widget.bind("<Leave>", self.leave)

+

+ def enter(self, event=None):

+ if self.in_progress:

+ return

+ if self.widget.winfo_class() == 'Treeview':

+ # Only show help when cursor is on row header.

+ rowid = self.widget.identify_row(event.y)

+ if rowid != '':

+ return

+ else:

+ x, y, cx, cy = self.widget.bbox("insert")

+

+ cursor = self.widget.winfo_pointerxy()

+ x = self.widget.winfo_rootx() + 35

+ y = self.widget.winfo_rooty() + 20

+ if cursor[1] > y and cursor[1] < y + 20:

+ y += 20

+

+ # creates a toplevel window

+ self.top_win = tkinter.Toplevel(self.widget)

+ # Leaves only the label and removes the app window

+ self.top_win.wm_overrideredirect(True)

+ self.top_win.wm_geometry("+%d+%d" % (x, y))

+ label = tkinter.Message(self.top_win,

+ text=self.text,

+ justify='left',

+ background='bisque',

+ relief='solid',

+ borderwidth=1,

+ font=("times", "10", "normal"))

+ label.pack(ipadx=1)

+ self.in_progress = True

+

+ def leave(self, event=None):

+ if self.top_win:

+ self.top_win.destroy()

+ self.in_progress = False

+

+

+class validating_entry(tkinter.Entry):

+ def __init__(self, master, **kw):

+ tkinter.Entry.__init__(*(self, master), **kw)

+ self.parent = master

+ self.old_value = ''

+ self.last_value = ''

+ self.variable = tkinter.StringVar()

+ self.variable.trace("w", self.callback)

+ self.config(textvariable=self.variable)

+ self.config({"background": "#c0c0c0"})

+ self.bind("<Return>", self.move_next)

+ self.bind("<Tab>", self.move_next)

+ self.bind("<Escape>", self.cancel)

+ for each in ['BackSpace', 'Delete']:

+ self.bind("<%s>" % each, self.ignore)

+ self.display(None)

+

+ def ignore(self, event):

+ return "break"

+

+ def move_next(self, event):

+ if self.row < 0:

+ return

+ row, col = self.row, self.col

+ txt, row_id, col_id = self.parent.get_next_cell(row, col)

+ self.display(txt, row_id, col_id)

+ return "break"

+

+ def cancel(self, event):

+ self.variable.set(self.old_value)

+ self.display(None)

+

+ def display(self, txt, row_id='', col_id=''):

+ if txt is None:

+ self.row = -1

+ self.col = -1

+ self.place_forget()

+ else:

+ row = int('0x' + row_id[1:], 0) - 1

+ col = int(col_id[1:]) - 1

+ self.row = row

+ self.col = col

+ self.old_value = txt

+ self.last_value = txt

+ x, y, width, height = self.parent.bbox(row_id, col)

+ self.place(x=x, y=y, w=width)

+ self.variable.set(txt)

+ self.focus_set()

+ self.icursor(0)

+

+ def callback(self, *Args):

+ cur_val = self.variable.get()

+ new_val = self.validate(cur_val)

+ if new_val is not None and self.row >= 0:

+ self.last_value = new_val

+ self.parent.set_cell(self.row, self.col, new_val)

+ self.variable.set(self.last_value)

+

+ def validate(self, value):

+ if len(value) > 0:

+ try:

+ int(value, 16)

+ except Exception:

+ return None

+

+ # Normalize the cell format

+ self.update()

+ cell_width = self.winfo_width()

+ max_len = custom_table.to_byte_length(cell_width) * 2

+ cur_pos = self.index("insert")

+ if cur_pos == max_len + 1:

+ value = value[-max_len:]

+ else:

+ value = value[:max_len]

+ if value == '':

+ value = '0'

+ fmt = '%%0%dX' % max_len

+ return fmt % int(value, 16)

+

+

+class custom_table(ttk.Treeview):

+ _Padding = 20

+ _Char_width = 6

+

+ def __init__(self, parent, col_hdr, bins):

+ cols = len(col_hdr)

+

+ col_byte_len = []

+ for col in range(cols): # Columns

+ col_byte_len.append(int(col_hdr[col].split(':')[1]))

+

+ byte_len = sum(col_byte_len)

+ rows = (len(bins) + byte_len - 1) // byte_len

+

+ self.rows = rows

+ self.cols = cols

+ self.col_byte_len = col_byte_len

+ self.col_hdr = col_hdr

+

+ self.size = len(bins)

+ self.last_dir = ''

+

+ style = ttk.Style()

+ style.configure("Custom.Treeview.Heading",

+ font=('calibri', 10, 'bold'),

+ foreground="blue")

+ ttk.Treeview.__init__(self, parent, height=rows,

+ columns=[''] + col_hdr, show='headings',

+ style="Custom.Treeview",

+ selectmode='none')

+ self.bind("<Button-1>", self.click)

+ self.bind("<FocusOut>", self.focus_out)

+ self.entry = validating_entry(self, width=4, justify=tkinter.CENTER)

+

+ self.heading(0, text='LOAD')

+ self.column(0, width=60, stretch=0, anchor=tkinter.CENTER)

+

+ for col in range(cols): # Columns

+ text = col_hdr[col].split(':')[0]

+ byte_len = int(col_hdr[col].split(':')[1])

+ self.heading(col+1, text=text)

+ self.column(col+1, width=self.to_cell_width(byte_len),

+ stretch=0, anchor=tkinter.CENTER)

+ idx = 0

+ for row in range(rows): # Rows

+ text = '%04X' % (row * len(col_hdr))

+ vals = ['%04X:' % (cols * row)]

+ for col in range(cols): # Columns

+ if idx >= len(bins):

+ break

+ byte_len = int(col_hdr[col].split(':')[1])

+ value = bytes_to_value(bins[idx:idx+byte_len])

+ hex = ("%%0%dX" % (byte_len * 2)) % value

+ vals.append(hex)

+ idx += byte_len

+ self.insert('', 'end', values=tuple(vals))

+ if idx >= len(bins):

+ break

+

+ @staticmethod

+ def to_cell_width(byte_len):

+ return byte_len * 2 * custom_table._Char_width + custom_table._Padding

+

+ @staticmethod

+ def to_byte_length(cell_width):

+ return (cell_width - custom_table._Padding) \

+ // (2 * custom_table._Char_width)

+

+ def focus_out(self, event):

+ self.entry.display(None)

+

+ def refresh_bin(self, bins):

+ if not bins:

+ return

+

+ # Reload binary into widget

+ bin_len = len(bins)

+ for row in range(self.rows):

+ iid = self.get_children()[row]

+ for col in range(self.cols):

+ idx = row * sum(self.col_byte_len) + \

+ sum(self.col_byte_len[:col])

+ byte_len = self.col_byte_len[col]

+ if idx + byte_len <= self.size:

+ byte_len = int(self.col_hdr[col].split(':')[1])

+ if idx + byte_len > bin_len:

+ val = 0

+ else:

+ val = bytes_to_value(bins[idx:idx+byte_len])

+ hex_val = ("%%0%dX" % (byte_len * 2)) % val

+ self.set(iid, col + 1, hex_val)

+

+ def get_cell(self, row, col):

+ iid = self.get_children()[row]

+ txt = self.item(iid, 'values')[col]

+ return txt

+

+ def get_next_cell(self, row, col):

+ rows = self.get_children()

+ col += 1

+ if col > self.cols:

+ col = 1

+ row += 1

+ cnt = row * sum(self.col_byte_len) + sum(self.col_byte_len[:col])

+ if cnt > self.size:

+ # Reached the last cell, so roll back to beginning

+ row = 0

+ col = 1

+

+ txt = self.get_cell(row, col)

+ row_id = rows[row]

+ col_id = '#%d' % (col + 1)

+ return (txt, row_id, col_id)

+

+ def set_cell(self, row, col, val):

+ iid = self.get_children()[row]

+ self.set(iid, col, val)

+

+ def load_bin(self):

+ # Load binary from file

+ path = filedialog.askopenfilename(

+ initialdir=self.last_dir,

+ title="Load binary file",

+ filetypes=(("Binary files", "*.bin"), (

+ "binary files", "*.bin")))

+ if path:

+ self.last_dir = os.path.dirname(path)

+ fd = open(path, 'rb')

+ bins = bytearray(fd.read())[:self.size]

+ fd.close()

+ bins.extend(b'\x00' * (self.size - len(bins)))

+ return bins

+

+ return None

+

+ def click(self, event):

+ row_id = self.identify_row(event.y)

+ col_id = self.identify_column(event.x)

+ if row_id == '' and col_id == '#1':

+ # Clicked on "LOAD" cell

+ bins = self.load_bin()

+ self.refresh_bin(bins)

+ return

+

+ if col_id == '#1':

+ # Clicked on column 1(Offset column)

+ return

+

+ item = self.identify('item', event.x, event.y)

+ if not item or not col_id:

+ # Not clicked on valid cell

+ return

+

+ # Clicked cell

+ row = int('0x' + row_id[1:], 0) - 1

+ col = int(col_id[1:]) - 1

+ if row * self.cols + col > self.size:

+ return

+

+ vals = self.item(item, 'values')

+ if col < len(vals):

+ txt = self.item(item, 'values')[col]

+ self.entry.display(txt, row_id, col_id)

+

+ def get(self):

+ bins = bytearray()

+ row_ids = self.get_children()

+ for row_id in row_ids:

+ row = int('0x' + row_id[1:], 0) - 1

+ for col in range(self.cols):

+ idx = row * sum(self.col_byte_len) + \

+ sum(self.col_byte_len[:col])

+ byte_len = self.col_byte_len[col]

+ if idx + byte_len > self.size:

+ break

+ hex = self.item(row_id, 'values')[col + 1]

+ values = value_to_bytes(int(hex, 16)

+ & ((1 << byte_len * 8) - 1), byte_len)

+ bins.extend(values)

+ return bins

+

+

+class c_uint24(Structure):

+ """Little-Endian 24-bit Unsigned Integer"""

+ _pack_ = 1

+ _fields_ = [('Data', (c_uint8 * 3))]

+

+ def __init__(self, val=0):

+ self.set_value(val)

+

+ def __str__(self, indent=0):

+ return '0x%.6x' % self.value

+

+ def __int__(self):

+ return self.get_value()

+

+ def set_value(self, val):

+ self.Data[0:3] = Val2Bytes(val, 3)

+

+ def get_value(self):

+ return Bytes2Val(self.Data[0:3])

+

+ value = property(get_value, set_value)
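[Editor's sketch, not part of the patch: `c_uint24` packs a little-endian 24-bit value into three raw bytes. A self-contained version (the class in the patch delegates to the `Val2Bytes`/`Bytes2Val` helpers defined further down instead):]

```python
from ctypes import Structure, c_uint8, sizeof


class CUint24(Structure):
    # Three raw bytes, no padding, little-endian accessors.
    _pack_ = 1
    _fields_ = [('Data', c_uint8 * 3)]

    def set_value(self, val):
        self.Data[0:3] = [(val >> (i * 8)) & 0xff for i in range(3)]

    def get_value(self):
        return self.Data[0] | (self.Data[1] << 8) | (self.Data[2] << 16)
```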

+

+

+class EFI_FIRMWARE_VOLUME_HEADER(Structure):

+ _fields_ = [

+ ('ZeroVector', ARRAY(c_uint8, 16)),

+ ('FileSystemGuid', ARRAY(c_uint8, 16)),

+ ('FvLength', c_uint64),

+ ('Signature', ARRAY(c_char, 4)),

+ ('Attributes', c_uint32),

+ ('HeaderLength', c_uint16),

+ ('Checksum', c_uint16),

+ ('ExtHeaderOffset', c_uint16),

+ ('Reserved', c_uint8),

+ ('Revision', c_uint8)

+ ]
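[Editor's sketch, not part of the patch: `ParseFd()` below maps this header over the raw firmware bytes with `from_buffer` and checks the `_FVH` signature. The same field list is reproduced here so the snippet is self-contained:]

```python
from ctypes import (Structure, ARRAY, c_uint8, c_uint64, c_char,
                    c_uint32, c_uint16, sizeof)


class FvHeader(Structure):
    # Same field list as EFI_FIRMWARE_VOLUME_HEADER above.
    _fields_ = [
        ('ZeroVector', ARRAY(c_uint8, 16)),
        ('FileSystemGuid', ARRAY(c_uint8, 16)),
        ('FvLength', c_uint64),
        ('Signature', ARRAY(c_char, 4)),
        ('Attributes', c_uint32),
        ('HeaderLength', c_uint16),
        ('Checksum', c_uint16),
        ('ExtHeaderOffset', c_uint16),
        ('Reserved', c_uint8),
        ('Revision', c_uint8)]


# Overlay the struct on a synthetic buffer, as ParseFd() does.
buf = bytearray(sizeof(FvHeader))
buf[40:44] = b'_FVH'   # Signature sits at offset 16 + 16 + 8 = 40
hdr = FvHeader.from_buffer(buf)
```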

+

+

+class EFI_FIRMWARE_VOLUME_EXT_HEADER(Structure):

+ _fields_ = [

+ ('FvName', ARRAY(c_uint8, 16)),

+ ('ExtHeaderSize', c_uint32)

+ ]

+

+

+class EFI_FFS_INTEGRITY_CHECK(Structure):

+ _fields_ = [

+ ('Header', c_uint8),

+ ('File', c_uint8)

+ ]

+

+

+class EFI_FFS_FILE_HEADER(Structure):

+ _fields_ = [

+ ('Name', ARRAY(c_uint8, 16)),

+ ('IntegrityCheck', EFI_FFS_INTEGRITY_CHECK),

+ ('Type', c_uint8),

+ ('Attributes', c_uint8),

+ ('Size', c_uint24),

+ ('State', c_uint8)

+ ]

+

+

+class EFI_COMMON_SECTION_HEADER(Structure):

+ _fields_ = [

+ ('Size', c_uint24),

+ ('Type', c_uint8)

+ ]

+

+

+class EFI_SECTION_TYPE:

+ """Enumeration of all valid firmware file section types."""

+ ALL = 0x00

+ COMPRESSION = 0x01

+ GUID_DEFINED = 0x02

+ DISPOSABLE = 0x03

+ PE32 = 0x10

+ PIC = 0x11

+ TE = 0x12

+ DXE_DEPEX = 0x13

+ VERSION = 0x14

+ USER_INTERFACE = 0x15

+ COMPATIBILITY16 = 0x16

+ FIRMWARE_VOLUME_IMAGE = 0x17

+ FREEFORM_SUBTYPE_GUID = 0x18

+ RAW = 0x19

+ PEI_DEPEX = 0x1b

+ SMM_DEPEX = 0x1c

+

+

+class FSP_COMMON_HEADER(Structure):

+ _fields_ = [

+ ('Signature', ARRAY(c_char, 4)),

+ ('HeaderLength', c_uint32)

+ ]

+

+

+class FSP_INFORMATION_HEADER(Structure):

+ _fields_ = [

+ ('Signature', ARRAY(c_char, 4)),

+ ('HeaderLength', c_uint32),

+ ('Reserved1', c_uint16),

+ ('SpecVersion', c_uint8),

+ ('HeaderRevision', c_uint8),

+ ('ImageRevision', c_uint32),

+ ('ImageId', ARRAY(c_char, 8)),

+ ('ImageSize', c_uint32),

+ ('ImageBase', c_uint32),

+ ('ImageAttribute', c_uint16),

+ ('ComponentAttribute', c_uint16),

+ ('CfgRegionOffset', c_uint32),

+ ('CfgRegionSize', c_uint32),

+ ('Reserved2', c_uint32),

+ ('TempRamInitEntryOffset', c_uint32),

+ ('Reserved3', c_uint32),

+ ('NotifyPhaseEntryOffset', c_uint32),

+ ('FspMemoryInitEntryOffset', c_uint32),

+ ('TempRamExitEntryOffset', c_uint32),

+ ('FspSiliconInitEntryOffset', c_uint32)

+ ]

+

+

+class FSP_EXTENDED_HEADER(Structure):

+ _fields_ = [

+ ('Signature', ARRAY(c_char, 4)),

+ ('HeaderLength', c_uint32),

+ ('Revision', c_uint8),

+ ('Reserved', c_uint8),

+ ('FspProducerId', ARRAY(c_char, 6)),

+ ('FspProducerRevision', c_uint32),

+ ('FspProducerDataSize', c_uint32)

+ ]

+

+

+class FSP_PATCH_TABLE(Structure):

+ _fields_ = [

+ ('Signature', ARRAY(c_char, 4)),

+ ('HeaderLength', c_uint16),

+ ('HeaderRevision', c_uint8),

+ ('Reserved', c_uint8),

+ ('PatchEntryNum', c_uint32)

+ ]

+

+

+class Section:

+ def __init__(self, offset, secdata):

+ self.SecHdr = EFI_COMMON_SECTION_HEADER.from_buffer(secdata, 0)

+ self.SecData = secdata[0:int(self.SecHdr.Size)]

+ self.Offset = offset

+

+

+def AlignPtr(offset, alignment=8):

+ return (offset + alignment - 1) & ~(alignment - 1)
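[Editor's sketch, not part of the patch: `AlignPtr` rounds an offset up to the next multiple of a power-of-two alignment using the classic add-then-mask trick:]

```python
def align_ptr(offset, alignment=8):
    # Round up to the next multiple of alignment (power of two assumed),
    # same bit trick as AlignPtr above.
    return (offset + alignment - 1) & ~(alignment - 1)
```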

+

+

+def Bytes2Val(bytes):

+ return reduce(lambda x, y: (x << 8) | y, bytes[:: -1])

+

+

+def Val2Bytes(value, blen):

+ return [(value >> (i*8) & 0xff) for i in range(blen)]
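[Editor's sketch, not part of the patch: `Bytes2Val` and `Val2Bytes` are little-endian inverses of each other; the reversed slice in `Bytes2Val` lets the left-fold consume the most significant byte first:]

```python
from functools import reduce


def bytes2val(bs):
    # Fold little-endian bytes into an integer, as Bytes2Val above.
    return reduce(lambda x, y: (x << 8) | y, bs[::-1])


def val2bytes(value, blen):
    # Emit blen little-endian bytes, as Val2Bytes above.
    return [(value >> (i * 8)) & 0xff for i in range(blen)]
```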

+

+

+class FirmwareFile:

+ def __init__(self, offset, filedata):

+ self.FfsHdr = EFI_FFS_FILE_HEADER.from_buffer(filedata, 0)

+ self.FfsData = filedata[0:int(self.FfsHdr.Size)]

+ self.Offset = offset

+ self.SecList = []

+

+ def ParseFfs(self):

+ ffssize = len(self.FfsData)

+ offset = sizeof(self.FfsHdr)

+ if bytes(self.FfsHdr.Name) != b'\xff' * 16:

+ while offset < (ffssize - sizeof(EFI_COMMON_SECTION_HEADER)):

+ sechdr = EFI_COMMON_SECTION_HEADER.from_buffer(

+ self.FfsData, offset)

+ sec = Section(

+ offset, self.FfsData[offset:offset + int(sechdr.Size)])

+ self.SecList.append(sec)

+ offset += int(sechdr.Size)

+ offset = AlignPtr(offset, 4)

+

+

+class FirmwareVolume:

+ def __init__(self, offset, fvdata):

+ self.FvHdr = EFI_FIRMWARE_VOLUME_HEADER.from_buffer(fvdata, 0)

+ self.FvData = fvdata[0: self.FvHdr.FvLength]

+ self.Offset = offset

+ if self.FvHdr.ExtHeaderOffset > 0:

+ self.FvExtHdr = EFI_FIRMWARE_VOLUME_EXT_HEADER.from_buffer(

+ self.FvData, self.FvHdr.ExtHeaderOffset)

+ else:

+ self.FvExtHdr = None

+ self.FfsList = []

+

+ def ParseFv(self):

+ fvsize = len(self.FvData)

+ if self.FvExtHdr:

+ offset = self.FvHdr.ExtHeaderOffset + self.FvExtHdr.ExtHeaderSize

+ else:

+ offset = self.FvHdr.HeaderLength

+ offset = AlignPtr(offset)

+ while offset < (fvsize - sizeof(EFI_FFS_FILE_HEADER)):

+ ffshdr = EFI_FFS_FILE_HEADER.from_buffer(self.FvData, offset)

+ if (bytes(ffshdr.Name) == b'\xff' * 16) and \

+ (int(ffshdr.Size) == 0xFFFFFF):

+ offset = fvsize

+ else:

+ ffs = FirmwareFile(

+ offset, self.FvData[offset:offset + int(ffshdr.Size)])

+ ffs.ParseFfs()

+ self.FfsList.append(ffs)

+ offset += int(ffshdr.Size)

+ offset = AlignPtr(offset)

+

+

+class FspImage:

+ def __init__(self, offset, fih, fihoff, patch):

+ self.Fih = fih

+ self.FihOffset = fihoff

+ self.Offset = offset

+ self.FvIdxList = []

+ self.Type = "XTMSXXXXOXXXXXXX"[(fih.ComponentAttribute >> 12) & 0x0F]

+ self.PatchList = patch

+ self.PatchList.append(fihoff + 0x1C)

+

+ def AppendFv(self, FvIdx):

+ self.FvIdxList.append(FvIdx)

+

+ def Patch(self, delta, fdbin):

+ count = 0

+ applied = 0

+ for idx, patch in enumerate(self.PatchList):

+ ptype = (patch >> 24) & 0x0F

+ if ptype not in [0x00, 0x0F]:

+ raise Exception('ERROR: Invalid patch type %d !' % ptype)

+ if patch & 0x80000000:

+ patch = self.Fih.ImageSize - (0x1000000 - (patch & 0xFFFFFF))

+ else:

+ patch = patch & 0xFFFFFF

+ if (patch < self.Fih.ImageSize) and \

+ (patch + sizeof(c_uint32) <= self.Fih.ImageSize):

+ offset = patch + self.Offset

+ value = Bytes2Val(fdbin[offset:offset+sizeof(c_uint32)])

+ value += delta

+ fdbin[offset:offset+sizeof(c_uint32)] = Val2Bytes(

+ value, sizeof(c_uint32))

+ applied += 1

+ count += 1

+ # Don't count the FSP base address patch entry appended at the end

+ if count != 0:

+ count -= 1

+ applied -= 1

+ return (count, applied)
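[Editor's sketch, not part of the patch: the heart of `FspImage.Patch` above is how a patch-table entry's low 24 bits are turned into an image offset, with bit 31 marking the offset as relative to the end of the FSP image. Isolated as a hypothetical helper:]

```python
def decode_patch_offset(patch, image_size):
    # Low 24 bits carry the offset; bit 31 means "from the image end",
    # mirroring the decoding inside FspImage.Patch above.
    if patch & 0x80000000:
        return image_size - (0x1000000 - (patch & 0xFFFFFF))
    return patch & 0xFFFFFF
```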

+

+

+class FirmwareDevice:

+ def __init__(self, offset, FdData):

+ self.FvList = []

+ self.FspList = []

+ self.FspExtList = []

+ self.FihList = []

+ self.BuildList = []

+ self.OutputText = ""

+ self.Offset = 0

+ self.FdData = FdData

+

+ def ParseFd(self):

+ offset = 0

+ fdsize = len(self.FdData)

+ self.FvList = []

+ while offset < (fdsize - sizeof(EFI_FIRMWARE_VOLUME_HEADER)):

+ fvh = EFI_FIRMWARE_VOLUME_HEADER.from_buffer(self.FdData,

+ offset)

+ if b'_FVH' != fvh.Signature:

+ raise Exception("ERROR: Invalid FV header !")

+ fv = FirmwareVolume(

+ offset, self.FdData[offset:offset + fvh.FvLength])

+ fv.ParseFv()

+ self.FvList.append(fv)

+ offset += fv.FvHdr.FvLength

+

+ def CheckFsp(self):

+ if len(self.FspList) == 0:

+ return

+

+ fih = None

+ for fsp in self.FspList:

+ if not fih:

+ fih = fsp.Fih

+ else:

+ newfih = fsp.Fih

+ if (newfih.ImageId != fih.ImageId) or \

+ (newfih.ImageRevision != fih.ImageRevision):

+ raise Exception(

+ "ERROR: Inconsistent FSP ImageId or "

+ "ImageRevision detected !")

+

+ def ParseFsp(self):

+ flen = 0

+ for idx, fv in enumerate(self.FvList):

+ # Check if this FV contains FSP header

+ if flen == 0:

+ if len(fv.FfsList) == 0:

+ continue

+ ffs = fv.FfsList[0]

+ if len(ffs.SecList) == 0:

+ continue

+ sec = ffs.SecList[0]

+ if sec.SecHdr.Type != EFI_SECTION_TYPE.RAW:

+ continue

+ fihoffset = ffs.Offset + sec.Offset + sizeof(sec.SecHdr)

+ fspoffset = fv.Offset

+ offset = fspoffset + fihoffset

+ fih = FSP_INFORMATION_HEADER.from_buffer(self.FdData, offset)

+ self.FihList.append(fih)

+ if b'FSPH' != fih.Signature:

+ continue

+

+ offset += fih.HeaderLength

+

+ offset = AlignPtr(offset, 2)

+ Extfih = FSP_EXTENDED_HEADER.from_buffer(self.FdData, offset)

+ self.FspExtList.append(Extfih)

+ offset = AlignPtr(offset, 4)

+ plist = []

+ while True:

+ fch = FSP_COMMON_HEADER.from_buffer(self.FdData, offset)

+ if b'FSPP' != fch.Signature:

+ offset += fch.HeaderLength

+ offset = AlignPtr(offset, 4)

+ else:

+ fspp = FSP_PATCH_TABLE.from_buffer(

+ self.FdData, offset)

+ offset += sizeof(fspp)

+ start_offset = offset + 32

+ end_offset = offset + 32

+ while True:

+ end_offset += 1

+ if (self.FdData[

+ end_offset: end_offset + 1] == b'\xff'):

+ break

+ self.BuildList.append(

+ self.FdData[start_offset:end_offset])

+ pdata = (c_uint32 * fspp.PatchEntryNum).from_buffer(

+ self.FdData, offset)

+ plist = list(pdata)

+ break

+

+ fsp = FspImage(fspoffset, fih, fihoffset, plist)

+ fsp.AppendFv(idx)

+ self.FspList.append(fsp)

+ flen = fsp.Fih.ImageSize - fv.FvHdr.FvLength

+ else:

+ fsp.AppendFv(idx)

+ flen -= fv.FvHdr.FvLength

+ if flen < 0:

+ raise Exception("ERROR: Incorrect FV size in image !")

+ self.CheckFsp()

+

+ def OutputFsp(self):

+ def copy_text_to_clipboard():

+ window.clipboard_clear()

+ window.clipboard_append(self.OutputText)

+

+ window = tkinter.Tk()

+ window.title("Fsp Headers")

+ window.resizable(0, 0)

+ # Window Size

+ window.geometry("300x400+350+150")

+ frame = tkinter.Frame(window)

+ frame.pack(side=tkinter.BOTTOM)

+ # Vertical (y) Scroll Bar

+ scroll = tkinter.Scrollbar(window)

+ scroll.pack(side=tkinter.RIGHT, fill=tkinter.Y)

+ text = tkinter.Text(window,

+ wrap=tkinter.NONE, yscrollcommand=scroll.set)

+ i = 0

+ self.OutputText = self.OutputText + "Fsp Header Details \n\n"

+ while i < len(self.FihList):

+ try:

+ self.OutputText += str(self.BuildList[i].decode()) + "\n"

+ except Exception:

+ self.OutputText += "No description found\n"

+ self.OutputText += "FSP Header :\n "

+ self.OutputText += "Signature : " + \

+ str(self.FihList[i].Signature.decode('utf-8')) + "\n "

+ self.OutputText += "Header Length : " + \

+ str(hex(self.FihList[i].HeaderLength)) + "\n "

+ self.OutputText += "Header Revision : " + \

+ str(hex(self.FihList[i].HeaderRevision)) + "\n "

+ self.OutputText += "Spec Version : " + \

+ str(hex(self.FihList[i].SpecVersion)) + "\n "

+ self.OutputText += "Image Revision : " + \

+ str(hex(self.FihList[i].ImageRevision)) + "\n "

+ self.OutputText += "Image Id : " + \

+ str(self.FihList[i].ImageId.decode('utf-8')) + "\n "

+ self.OutputText += "Image Size : " + \

+ str(hex(self.FihList[i].ImageSize)) + "\n "

+ self.OutputText += "Image Base : " + \

+ str(hex(self.FihList[i].ImageBase)) + "\n "

+ self.OutputText += "Image Attribute : " + \

+ str(hex(self.FihList[i].ImageAttribute)) + "\n "

+ self.OutputText += "Cfg Region Offset : " + \

+ str(hex(self.FihList[i].CfgRegionOffset)) + "\n "

+ self.OutputText += "Cfg Region Size : " + \

+ str(hex(self.FihList[i].CfgRegionSize)) + "\n "

+ self.OutputText += "API Entry Num : " + \

+ str(hex(self.FihList[i].Reserved2)) + "\n "

+ self.OutputText += "Temp Ram Init Entry : " + \

+ str(hex(self.FihList[i].TempRamInitEntryOffset)) + "\n "

+ self.OutputText += "FSP Init Entry : " + \

+ str(hex(self.FihList[i].Reserved3)) + "\n "

+ self.OutputText += "Notify Phase Entry : " + \

+ str(hex(self.FihList[i].NotifyPhaseEntryOffset)) + "\n "

+ self.OutputText += "Fsp Memory Init Entry : " + \

+ str(hex(self.FihList[i].FspMemoryInitEntryOffset)) + "\n "

+ self.OutputText += "Temp Ram Exit Entry : " + \

+ str(hex(self.FihList[i].TempRamExitEntryOffset)) + "\n "

+ self.OutputText += "Fsp Silicon Init Entry : " + \

+ str(hex(self.FihList[i].FspSiliconInitEntryOffset)) + "\n\n"

+ self.OutputText += "FSP Extended Header:\n "

+ self.OutputText += "Signature : " + \

+ str(self.FspExtList[i].Signature.decode('utf-8')) + "\n "

+ self.OutputText += "Header Length : " + \

+ str(hex(self.FspExtList[i].HeaderLength)) + "\n "

+ self.OutputText += "Header Revision : " + \

+ str(hex(self.FspExtList[i].Revision)) + "\n "

+ self.OutputText += "Fsp Producer Id : " + \

+ str(self.FspExtList[i].FspProducerId.decode('utf-8')) + "\n "

+ self.OutputText += "FspProducerRevision : " + \

+ str(hex(self.FspExtList[i].FspProducerRevision)) + "\n\n"

+ i += 1

+ text.insert(tkinter.INSERT, self.OutputText)

+ text.pack()

+ # Configure the scrollbars

+ scroll.config(command=text.yview)

+ copy_button = tkinter.Button(

+ window, text="Copy to Clipboard", command=copy_text_to_clipboard)

+ copy_button.pack(in_=frame, side=tkinter.LEFT, padx=20, pady=10)

+ exit_button = tkinter.Button(

+ window, text="Close", command=window.destroy)

+ exit_button.pack(in_=frame, side=tkinter.RIGHT, padx=20, pady=10)

+ window.mainloop()

+

+

+class state:

+ def __init__(self):

+ self.state = False

+

+ def set(self, value):

+ self.state = value

+

+ def get(self):

+ return self.state

+

+

+class application(tkinter.Frame):

+ def __init__(self, master=None):

+ root = master

+

+ self.debug = True

+ self.mode = 'FSP'

+ self.last_dir = '.'

+ self.page_id = ''

+ self.page_list = {}

+ self.conf_list = {}

+ self.cfg_data_obj = None

+ self.org_cfg_data_bin = None

+ self.in_left = state()

+ self.in_right = state()

+

+ # Check if current directory contains a file with a .yaml extension

+ # if not default self.last_dir to a Platform directory where it is

+ # easier to locate *BoardPkg\CfgData\*Def.yaml files

+ self.last_dir = '.'

+ if not any(fname.endswith('.yaml') for fname in os.listdir('.')):

+ platform_path = Path(os.path.realpath(__file__)).parents[2].\

+ joinpath('Platform')

+ if platform_path.exists():

+ self.last_dir = platform_path

+

+ tkinter.Frame.__init__(self, master, borderwidth=2)

+

+ self.menu_string = [

+ 'Save Config Data to Binary', 'Load Config Data from Binary',

+ 'Show Binary Information',

+ 'Load Config Changes from Delta File',

+ 'Save Config Changes to Delta File',

+ 'Save Full Config Data to Delta File',

+ 'Open Config BSF file'

+ ]

+

+ root.geometry("1200x800")

+

+ paned = ttk.Panedwindow(root, orient=tkinter.HORIZONTAL)

+ paned.pack(fill=tkinter.BOTH, expand=True, padx=(4, 4))

+

+ status = tkinter.Label(master, text="", bd=1, relief=tkinter.SUNKEN,

+ anchor=tkinter.W)

+ status.pack(side=tkinter.BOTTOM, fill=tkinter.X)

+

+ frame_left = ttk.Frame(paned, height=800, relief="groove")

+

+ self.left = ttk.Treeview(frame_left, show="tree")

+

+ # Set up tree HScroller

+ pady = (10, 10)

+ self.tree_scroll = ttk.Scrollbar(frame_left,

+ orient="vertical",

+ command=self.left.yview)

+ self.left.configure(yscrollcommand=self.tree_scroll.set)

+ self.left.bind("<<TreeviewSelect>>", self.on_config_page_select_change)

+ self.left.bind("<Enter>", lambda e: self.in_left.set(True))

+ self.left.bind("<Leave>", lambda e: self.in_left.set(False))

+ self.left.bind("<MouseWheel>", self.on_tree_scroll)

+

+ self.left.pack(side='left',

+ fill=tkinter.BOTH,

+ expand=True,

+ padx=(5, 0),

+ pady=pady)

+ self.tree_scroll.pack(side='right', fill=tkinter.Y,

+ pady=pady, padx=(0, 5))

+

+ frame_right = ttk.Frame(paned, relief="groove")

+ self.frame_right = frame_right

+

+ self.conf_canvas = tkinter.Canvas(frame_right, highlightthickness=0)

+ self.page_scroll = ttk.Scrollbar(frame_right,

+ orient="vertical",

+ command=self.conf_canvas.yview)

+ self.right_grid = ttk.Frame(self.conf_canvas)

+ self.conf_canvas.configure(yscrollcommand=self.page_scroll.set)

+ self.conf_canvas.pack(side='left',

+ fill=tkinter.BOTH,

+ expand=True,

+ pady=pady,

+ padx=(5, 0))

+ self.page_scroll.pack(side='right', fill=tkinter.Y,

+ pady=pady, padx=(0, 5))

+ self.conf_canvas.create_window(0, 0, window=self.right_grid,

+ anchor='nw')

+ self.conf_canvas.bind('<Enter>', lambda e: self.in_right.set(True))

+ self.conf_canvas.bind('<Leave>', lambda e: self.in_right.set(False))

+ self.conf_canvas.bind("<Configure>", self.on_canvas_configure)

+ self.conf_canvas.bind_all("<MouseWheel>", self.on_page_scroll)

+

+ paned.add(frame_left, weight=2)

+ paned.add(frame_right, weight=10)

+

+ style = ttk.Style()

+ style.layout("Treeview", [('Treeview.treearea', {'sticky': 'nswe'})])

+

+ menubar = tkinter.Menu(root)

+ file_menu = tkinter.Menu(menubar, tearoff=0)

+ file_menu.add_command(label="Open Config YAML file",

+ command=self.load_from_yaml)

+ file_menu.add_command(label=self.menu_string[6],

+ command=self.load_from_bsf_file)

+ file_menu.add_command(label=self.menu_string[2],

+ command=self.load_from_fd)

+ file_menu.add_command(label=self.menu_string[0],

+ command=self.save_to_bin,

+ state='disabled')

+ file_menu.add_command(label=self.menu_string[1],

+ command=self.load_from_bin,

+ state='disabled')

+ file_menu.add_command(label=self.menu_string[3],

+ command=self.load_from_delta,

+ state='disabled')

+ file_menu.add_command(label=self.menu_string[4],

+ command=self.save_to_delta,

+ state='disabled')

+ file_menu.add_command(label=self.menu_string[5],

+ command=self.save_full_to_delta,

+ state='disabled')

+ file_menu.add_command(label="About", command=self.about)

+ menubar.add_cascade(label="File", menu=file_menu)

+ self.file_menu = file_menu

+

+ root.config(menu=menubar)

+

+ if len(sys.argv) > 1:

+ path = sys.argv[1]

+ if not path.endswith('.yaml') and not path.endswith('.pkl'):

+ messagebox.showerror('LOADING ERROR',

+ "Unsupported file '%s' !" % path)

+ return

+ else:

+ self.load_cfg_file(path)

+

+ if len(sys.argv) > 2:

+ path = sys.argv[2]

+ if path.endswith('.dlt'):

+ self.load_delta_file(path)

+ elif path.endswith('.bin'):

+ self.load_bin_file(path)

+ else:

+ messagebox.showerror('LOADING ERROR',

+ "Unsupported file '%s' !" % path)

+ return

+

+ def set_object_name(self, widget, name):

+ self.conf_list[id(widget)] = name

+

+ def get_object_name(self, widget):

+ if id(widget) in self.conf_list:

+ return self.conf_list[id(widget)]

+ else:

+ return None

+

+ def limit_entry_size(self, variable, limit):

+ value = variable.get()

+ if len(value) > limit:

+ variable.set(value[:limit])

+

+ def on_canvas_configure(self, event):

+ self.right_grid.grid_columnconfigure(0, minsize=event.width)

+

+ def on_tree_scroll(self, event):

+ if not self.in_left.get() and self.in_right.get():

+ # This prevents scroll event from being handled by both left and

+ # right frame at the same time.

+ self.on_page_scroll(event)

+ return 'break'

+

+ def on_page_scroll(self, event):

+ if self.in_right.get():

+ # Only scroll when it is in active area

+ lo, hi = self.page_scroll.get()

+ if not (lo == 0.0 and hi == 1.0):

+ self.conf_canvas.yview_scroll(-1 * int(event.delta / 120),

+ 'units')

+

+ def update_visibility_for_widget(self, widget, args):

+

+ visible = True

+ item = self.get_config_data_item_from_widget(widget, True)

+ if item is None:

+ return visible

+ elif not item:

+ return visible

+

+ result = 1

+ if item['condition']:

+ result = self.evaluate_condition(item)

+ if result == 2:

+ # Gray

+ widget.configure(state='disabled')

+ elif result == 0:

+ # Hide

+ visible = False

+ widget.grid_remove()

+ else:

+ # Show

+ widget.grid()

+ widget.configure(state='normal')

+

+ return visible

+

+ def update_widgets_visibility_on_page(self):

+ self.walk_widgets_in_layout(self.right_grid,

+ self.update_visibility_for_widget)

+

+ def combo_select_changed(self, event):

+ self.update_config_data_from_widget(event.widget, None)

+ self.update_widgets_visibility_on_page()

+

+ def edit_num_finished(self, event):

+ widget = event.widget

+ item = self.get_config_data_item_from_widget(widget)

+ if not item:

+ return

+ parts = item['type'].split(',')

+ if len(parts) > 3:

+ min_str = parts[2].lstrip()[1:]

+ max_str = parts[3].rstrip()[:-1]

+ min_val = array_str_to_value(min_str)

+ max_val = array_str_to_value(max_str)

+ text = widget.get()

+ if ',' in text:

+ text = '{ %s }' % text

+ try:

+ value = array_str_to_value(text)

+ if value < min_val or value > max_val:

+ raise Exception('Invalid input!')

+ self.set_config_item_value(item, text)

+ except Exception:

+ pass

+

+ text = item['value'].strip('{').strip('}').strip()

+ widget.delete(0, tkinter.END)

+ widget.insert(0, text)

+

+ self.update_widgets_visibility_on_page()

+

+ def update_page_scroll_bar(self):

+ # Update scrollbar

+ self.frame_right.update()

+ self.conf_canvas.config(scrollregion=self.conf_canvas.bbox("all"))

+

+ def on_config_page_select_change(self, event):

+ self.update_config_data_on_page()

+ sel = self.left.selection()

+ if len(sel) > 0:

+ page_id = sel[0]

+ self.build_config_data_page(page_id)

+ self.update_widgets_visibility_on_page()

+ self.update_page_scroll_bar()

+

+ def walk_widgets_in_layout(self, parent, callback_function, args=None):

+ for widget in parent.winfo_children():

+ callback_function(widget, args)

+

+ def clear_widgets_inLayout(self, parent=None):

+ if parent is None:

+ parent = self.right_grid

+

+ for widget in parent.winfo_children():

+ widget.destroy()

+

+ parent.grid_forget()

+ self.conf_list.clear()

+

+ def build_config_page_tree(self, cfg_page, parent):

+ for page in cfg_page['child']:

+ page_id = next(iter(page))

+ # Put CFG items into related page list

+ self.page_list[page_id] = self.cfg_data_obj.get_cfg_list(page_id)

+ self.page_list[page_id].sort(key=lambda x: x['order'])

+ page_name = self.cfg_data_obj.get_page_title(page_id)

+ child = self.left.insert(

+ parent, 'end',

+ iid=page_id, text=page_name,

+ value=0)

+ if len(page[page_id]) > 0:

+ self.build_config_page_tree(page[page_id], child)

+

+ def is_config_data_loaded(self):

+ return len(self.page_list) > 0

+

+ def set_current_config_page(self, page_id):

+ self.page_id = page_id

+

+ def get_current_config_page(self):

+ return self.page_id

+

+ def get_current_config_data(self):

+ page_id = self.get_current_config_page()

+ if page_id in self.page_list:

+ return self.page_list[page_id]

+ else:

+ return []

+

+ invalid_values = {}

+

+ def build_config_data_page(self, page_id):

+ self.clear_widgets_inLayout()

+ self.set_current_config_page(page_id)

+ disp_list = []

+ for item in self.get_current_config_data():

+ disp_list.append(item)

+ row = 0

+ disp_list.sort(key=lambda x: x['order'])

+ for item in disp_list:

+ self.add_config_item(item, row)

+ row += 2

+ if self.invalid_values:

+ string = 'The following contains invalid options/values \n\n'

+ for i in self.invalid_values:

+ string += i + ": " + str(self.invalid_values[i]) + "\n"

+ reply = messagebox.showwarning('Warning!', string)

+ if reply == 'ok':

+ self.invalid_values.clear()

+

+ fsp_version = ''

+

+ def load_config_data(self, file_name):

+ gen_cfg_data = CGenYamlCfg()

+ if file_name.endswith('.pkl'):

+ with open(file_name, "rb") as pkl_file:

+ gen_cfg_data.__dict__ = marshal.load(pkl_file)

+ gen_cfg_data.prepare_marshal(False)

+ elif file_name.endswith('.yaml'):

+ if gen_cfg_data.load_yaml(file_name) != 0:

+ raise Exception(gen_cfg_data.get_last_error())

+ else:

+ raise Exception('Unsupported file "%s" !' % file_name)

+ # checking fsp version

+ if gen_cfg_data.detect_fsp():

+ self.fsp_version = '2.X'

+ else:

+ self.fsp_version = '1.X'

+ return gen_cfg_data

+

+ def about(self):

+ msg = 'Configuration Editor\n--------------------------------\n \

+ Version 0.8\n2021'

+ lines = msg.split('\n')

+ width = 30

+ text = []

+ for line in lines:

+ text.append(line.center(width, ' '))

+ messagebox.showinfo('Config Editor', '\n'.join(text))

+

+ def update_last_dir(self, path):

+ self.last_dir = os.path.dirname(path)

+

+ def get_open_file_name(self, ftype):

+ if self.is_config_data_loaded():

+ if ftype == 'dlt':

+ question = ''

+ elif ftype == 'bin':

+ question = 'All configuration will be reloaded from BIN file, \

+ continue ?'

+ elif ftype == 'yaml':

+ question = ''

+ elif ftype == 'bsf':

+ question = ''

+ else:

+ raise Exception('Unsupported file type !')

+ if question:

+ reply = messagebox.askquestion('', question, icon='warning')

+ if reply == 'no':

+ return None

+

+ if ftype == 'yaml':

+ if self.mode == 'FSP':

+ file_type = 'YAML'

+ file_ext = 'yaml'

+ else:

+ file_type = 'YAML or PKL'

+ file_ext = 'pkl *.yaml'

+ else:

+ file_type = ftype.upper()

+ file_ext = ftype

+

+ path = filedialog.askopenfilename(

+ initialdir=self.last_dir,

+ title="Load file",

+ filetypes=(("%s files" % file_type, "*.%s" % file_ext), (

+ "all files", "*.*")))

+ if path:

+ self.update_last_dir(path)

+ return path

+ else:

+ return None

+

+ def load_from_delta(self):

+ path = self.get_open_file_name('dlt')

+ if not path:

+ return

+ self.load_delta_file(path)

+

+ def load_delta_file(self, path):

+ self.reload_config_data_from_bin(self.org_cfg_data_bin)

+ try:

+ self.cfg_data_obj.override_default_value(path)

+ except Exception as e:

+ messagebox.showerror('LOADING ERROR', str(e))

+ return

+ self.update_last_dir(path)

+ self.refresh_config_data_page()

+

+ def load_from_bin(self):

+ path = filedialog.askopenfilename(

+ initialdir=self.last_dir,

+ title="Load file",

+ filetypes={("Binaries", "*.fv *.fd *.bin *.rom")})

+ if not path:

+ return

+ self.load_bin_file(path)

+

+ def load_bin_file(self, path):

+ with open(path, 'rb') as fd:

+ bin_data = bytearray(fd.read())

+ if len(bin_data) < len(self.org_cfg_data_bin):

+ messagebox.showerror('LOADING ERROR', 'Binary file size is \

+ smaller than what YAML requires !')

+ return

+

+ try:

+ self.reload_config_data_from_bin(bin_data)

+ except Exception as e:

+ messagebox.showerror('LOADING ERROR', str(e))

+ return

+

+ def load_from_bsf_file(self):

+ path = self.get_open_file_name('bsf')

+ if not path:

+ return

+ self.load_bsf_file(path)

+

+ def load_bsf_file(self, path):

+ bsf_file = path

+ dsc_file = os.path.splitext(bsf_file)[0] + '.dsc'

+ yaml_file = os.path.splitext(bsf_file)[0] + '.yaml'

+ bsf_to_dsc(bsf_file, dsc_file)

+ dsc_to_yaml(dsc_file, yaml_file)

+

+ self.load_cfg_file(yaml_file)

+ return

+

+ def load_from_fd(self):

+ path = filedialog.askopenfilename(

+ initialdir=self.last_dir,

+ title="Load file",

+ filetypes={("Binaries", "*.fv *.fd *.bin *.rom")})

+ if not path:

+ return

+ self.load_fd_file(path)

+

+ def load_fd_file(self, path):

+ with open(path, 'rb') as fd:

+ bin_data = bytearray(fd.read())

+

+ fd = FirmwareDevice(0, bin_data)

+ fd.ParseFd()

+ fd.ParseFsp()

+ fd.OutputFsp()

+

+ def load_cfg_file(self, path):

+ # Save current values in widget and clear database

+ self.clear_widgets_inLayout()

+ self.left.delete(*self.left.get_children())

+

+ self.cfg_data_obj = self.load_config_data(path)

+

+ self.update_last_dir(path)

+ self.org_cfg_data_bin = self.cfg_data_obj.generate_binary_array()

+ self.build_config_page_tree(self.cfg_data_obj.get_cfg_page()['root'],

+ '')

+

+ msg_string = 'Click YES if it is FULL FSP '\

+ + self.fsp_version + ' Binary'

+ reply = messagebox.askquestion('Form', msg_string)

+ if reply == 'yes':

+ self.load_from_bin()

+

+ for menu in self.menu_string:

+ self.file_menu.entryconfig(menu, state="normal")

+

+ return 0

+

+ def load_from_yaml(self):

+ path = self.get_open_file_name('yaml')

+ if not path:

+ return

+

+ self.load_cfg_file(path)

+

+ def get_save_file_name(self, extension):

+ path = filedialog.asksaveasfilename(

+ initialdir=self.last_dir,

+ title="Save file",

+ defaultextension=extension)

+ if path:

+ self.last_dir = os.path.dirname(path)

+ return path

+ else:

+ return None

+

+ def save_delta_file(self, full=False):

+ path = self.get_save_file_name(".dlt")

+ if not path:

+ return

+

+ self.update_config_data_on_page()

+ new_data = self.cfg_data_obj.generate_binary_array()

+ self.cfg_data_obj.generate_delta_file_from_bin(path,

+ self.org_cfg_data_bin,

+ new_data, full)

+

+ def save_to_delta(self):

+ self.save_delta_file()

+

+ def save_full_to_delta(self):

+ self.save_delta_file(True)

+

+ def save_to_bin(self):

+ path = self.get_save_file_name(".bin")

+ if not path:

+ return

+

+ self.update_config_data_on_page()

+ bins = self.cfg_data_obj.save_current_to_bin()

+

+ with open(path, 'wb') as fd:

+ fd.write(bins)

+

+ def refresh_config_data_page(self):

+ self.clear_widgets_inLayout()

+ self.on_config_page_select_change(None)

+

+ def reload_config_data_from_bin(self, bin_dat):

+ self.cfg_data_obj.load_default_from_bin(bin_dat)

+ self.refresh_config_data_page()

+

+ def set_config_item_value(self, item, value_str):

+ itype = item['type'].split(',')[0]

+ if itype == "Table":

+ new_value = value_str

+ elif itype == "EditText":

+ length = (self.cfg_data_obj.get_cfg_item_length(item) + 7) // 8

+ new_value = value_str[:length]

+ if item['value'].startswith("'"):

+ new_value = "'%s'" % new_value

+ else:

+ try:

+ new_value = self.cfg_data_obj.reformat_value_str(

+ value_str,

+ self.cfg_data_obj.get_cfg_item_length(item),

+ item['value'])

+ except Exception:

+ print("WARNING: Failed to format value string '%s' for '%s' !"

+ % (value_str, item['path']))

+ new_value = item['value']

+

+ if item['value'] != new_value:

+ if self.debug:

+ print('Update %s from %s to %s !'

+ % (item['cname'], item['value'], new_value))

+ item['value'] = new_value

+

+ def get_config_data_item_from_widget(self, widget, label=False):

+ name = self.get_object_name(widget)

+ if not name or not len(self.page_list):

+ return None

+

+ if name.startswith('LABEL_'):

+ if label:

+ path = name[6:]

+ else:

+ return None

+ else:

+ path = name

+ item = self.cfg_data_obj.get_item_by_path(path)

+ return item

+

+ def update_config_data_from_widget(self, widget, args):

+ item = self.get_config_data_item_from_widget(widget)

+ if item is None:

+ return

+ elif not item:

+ if isinstance(widget, tkinter.Label):

+ return

+ raise Exception('Failed to find "%s" !' %

+ self.get_object_name(widget))

+

+ itype = item['type'].split(',')[0]

+ if itype == "Combo":

+ opt_list = self.cfg_data_obj.get_cfg_item_options(item)

+ tmp_list = [opt[0] for opt in opt_list]

+ idx = widget.current()

+ if idx != -1:

+ self.set_config_item_value(item, tmp_list[idx])

+ elif itype in ["EditNum", "EditText"]:

+ self.set_config_item_value(item, widget.get())

+ elif itype in ["Table"]:

+ new_value = bytes_to_bracket_str(widget.get())

+ self.set_config_item_value(item, new_value)

+

+ def evaluate_condition(self, item):

+ try:

+ result = self.cfg_data_obj.evaluate_condition(item)

+ except Exception:

+ print("WARNING: Condition '%s' is invalid for '%s' !"

+ % (item['condition'], item['path']))

+ result = 1

+ return result

+

+ def add_config_item(self, item, row):

+ parent = self.right_grid

+

+ name = tkinter.Label(parent, text=item['name'], anchor="w")

+

+ parts = item['type'].split(',')

+ itype = parts[0]

+ widget = None

+

+ if itype == "Combo":

+ # Build

+ opt_list = self.cfg_data_obj.get_cfg_item_options(item)

+ current_value = self.cfg_data_obj.get_cfg_item_value(item, False)

+ option_list = []

+ current = None

+

+ for idx, option in enumerate(opt_list):

+ option_str = option[0]

+ try:

+ option_value = self.cfg_data_obj.get_value(

+ option_str,

+ len(option_str), False)

+ except Exception:

+ option_value = 0

+ print('WARNING: Option "%s" has invalid format for "%s" !'

+ % (option_str, item['path']))

+ if option_value == current_value:

+ current = idx

+ option_list.append(option[1])

+

+ widget = ttk.Combobox(parent, values=option_list, state="readonly")

+ widget.bind("<<ComboboxSelected>>", self.combo_select_changed)

+ widget.unbind_class("TCombobox", "<MouseWheel>")

+

+ if current is None:

+ print('WARNING: Value "%s" is an invalid option for "%s" !' %

+ (current_value, item['path']))

+ self.invalid_values[item['path']] = current_value

+ else:

+ widget.current(current)

+

+ elif itype in ["EditNum", "EditText"]:

+ txt_val = tkinter.StringVar()

+ widget = tkinter.Entry(parent, textvariable=txt_val)

+ value = item['value'].strip("'")

+ if itype in ["EditText"]:

+ txt_val.trace(

+ 'w',

+ lambda *args: self.limit_entry_size

+ (txt_val, (self.cfg_data_obj.get_cfg_item_length(item)

+ + 7) // 8))

+ elif itype in ["EditNum"]:

+ value = item['value'].strip("{").strip("}").strip()

+ widget.bind("<FocusOut>", self.edit_num_finished)

+ txt_val.set(value)

+

+ elif itype in ["Table"]:

+ bins = self.cfg_data_obj.get_cfg_item_value(item, True)

+ col_hdr = item['option'].split(',')

+ widget = custom_table(parent, col_hdr, bins)

+

+ else:

+ if itype and itype not in ["Reserved"]:

+ print("WARNING: Type '%s' is invalid for '%s' !" %

+ (itype, item['path']))

+ self.invalid_values[item['path']] = itype

+

+ if widget:

+ create_tool_tip(widget, item['help'])

+ self.set_object_name(name, 'LABEL_' + item['path'])

+ self.set_object_name(widget, item['path'])

+ name.grid(row=row, column=0, padx=10, pady=5, sticky="nsew")

+ widget.grid(row=row + 1, rowspan=1, column=0,

+ padx=10, pady=5, sticky="nsew")

+

+ def update_config_data_on_page(self):

+ self.walk_widgets_in_layout(self.right_grid,

+ self.update_config_data_from_widget)

+

+

+if __name__ == '__main__':

+ root = tkinter.Tk()

+ app = application(master=root)

+ root.title("Config Editor")

+ root.mainloop()

diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py b/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py
new file mode 100644
index 0000000000..25fd9c547e
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py
@@ -0,0 +1,2252 @@
+# @ GenYamlCfg.py

+#

+# Copyright (c) 2020 - 2021, Intel Corporation. All rights reserved.<BR>

+# SPDX-License-Identifier: BSD-2-Clause-Patent

+#

+#

+

+import os

+import sys

+import re

+import marshal

+import string

+import operator as op

+import ast

+import tkinter.messagebox as messagebox

+

+from datetime import date

+from collections import OrderedDict

+from CommonUtility import value_to_bytearray, value_to_bytes, \

+ bytes_to_value, get_bits_from_bytes, set_bits_to_bytes

+

+# Generated file copyright header

+__copyright_tmp__ = """/** @file

+

+ Platform Configuration %s File.

+

+ Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>

+ SPDX-License-Identifier: BSD-2-Clause-Patent

+

+ This file is automatically generated. Please do NOT modify !!!

+

+**/

+"""

+

+

+def get_copyright_header(file_type, allow_modify=False):

+ file_description = {

+ 'yaml': 'Boot Setting',

+ 'dlt': 'Delta',

+ 'inc': 'C Binary Blob',

+ 'h': 'C Struct Header'

+ }

+ if file_type in ['yaml', 'dlt']:

+ comment_char = '#'

+ else:

+ comment_char = ''

+ lines = __copyright_tmp__.split('\n')

+ if allow_modify:

+ lines = [line for line in lines if 'Please do NOT modify' not in line]

+ copyright_hdr = '\n'.join('%s%s' % (comment_char, line)

+ for line in lines)[:-1] + '\n'

+ return copyright_hdr % (file_description[file_type], date.today().year)

+

+

+def check_quote(text):

+ if (text[0] == "'" and text[-1] == "'") or (text[0] == '"'

+ and text[-1] == '"'):

+ return True

+ return False

+

+

+def strip_quote(text):

+ new_text = text.strip()

+ if check_quote(new_text):

+ return new_text[1:-1]

+ return text

+

+

+def strip_delimiter(text, delim):

+ new_text = text.strip()

+ if new_text:

+ if new_text[0] == delim[0] and new_text[-1] == delim[-1]:

+ return new_text[1:-1]

+ return text

+

+

+def bytes_to_bracket_str(bytes):

+ return '{ %s }' % (', '.join('0x%02x' % i for i in bytes))

+

+

+def array_str_to_value(val_str):

+ val_str = val_str.strip()

+ val_str = strip_delimiter(val_str, '{}')

+ val_str = strip_quote(val_str)

+ value = 0

+ for each in val_str.split(',')[::-1]:

+ each = each.strip()

+ value = (value << 8) | int(each, 0)

+ return value
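For reference, the two byte-array helpers above are inverses: `array_str_to_value` parses a comma-separated byte string little-endian (the first listed byte is the least significant), and `bytes_to_bracket_str` formats bytes back into the `{ 0x.. }` form. A simplified standalone sketch (reimplemented here so it runs outside the tool; the quote-stripping path is omitted):

```python
def array_str_to_value(val_str):
    # Strip optional '{ }' delimiters, then fold bytes right-to-left
    # so the first listed byte ends up least significant.
    val_str = val_str.strip()
    if val_str.startswith('{') and val_str.endswith('}'):
        val_str = val_str[1:-1]
    value = 0
    for each in val_str.split(',')[::-1]:
        value = (value << 8) | int(each.strip(), 0)
    return value

def bytes_to_bracket_str(data):
    # Inverse direction: bytes -> '{ 0x.., 0x.. }' display form.
    return '{ %s }' % (', '.join('0x%02x' % i for i in data))

# '{ 0x34, 0x12 }' encodes the 16-bit value 0x1234.
demo = array_str_to_value('{ 0x34, 0x12 }')
```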

+

+

+def write_lines(lines, file):

+ fo = open(file, "w")

+ fo.write(''.join([x[0] for x in lines]))

+ fo.close()

+

+

+def read_lines(file):

+ if not os.path.exists(file):

+ test_file = os.path.basename(file)

+ if os.path.exists(test_file):

+ file = test_file

+ fi = open(file, 'r')

+ lines = fi.readlines()

+ fi.close()

+ return lines

+

+

+def expand_file_value(path, value_str):

+ result = bytearray()

+ match = re.match("\\{\\s*FILE:(.+)\\}", value_str)

+ if match:

+ file_list = match.group(1).split(',')

+ for file in file_list:

+ file = file.strip()

+ bin_path = os.path.join(path, file)

+ result.extend(bytearray(open(bin_path, 'rb').read()))


+ return result
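The `{ FILE: ... }` syntax handled above embeds external binaries in a YAML value: each listed file's raw bytes are concatenated in order. A standalone sketch of the same expansion, demonstrated against two throwaway files (the file names are hypothetical):

```python
import os
import re
import tempfile

def expand_file_value(path, value_str):
    # '{ FILE: a.bin, b.bin }' -> concatenated raw bytes of the files.
    result = bytearray()
    match = re.match(r'\{\s*FILE:(.+)\}', value_str)
    if match:
        for name in match.group(1).split(','):
            with open(os.path.join(path, name.strip()), 'rb') as f:
                result.extend(f.read())
    return result

# Demo: create two small binaries in a temp directory and expand them.
tmp = tempfile.mkdtemp()
for name, data in (('a.bin', b'\x01\x02'), ('b.bin', b'\x03')):
    with open(os.path.join(tmp, name), 'wb') as f:
        f.write(data)
value = expand_file_value(tmp, '{ FILE: a.bin, b.bin }')
```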

+

+

+class ExpressionEval(ast.NodeVisitor):

+ operators = {

+ ast.Add: op.add,

+ ast.Sub: op.sub,

+ ast.Mult: op.mul,

+ ast.Div: op.floordiv,

+ ast.Mod: op.mod,

+ ast.Eq: op.eq,

+ ast.NotEq: op.ne,

+ ast.Gt: op.gt,

+ ast.Lt: op.lt,

+ ast.GtE: op.ge,

+ ast.LtE: op.le,

+ ast.BitXor: op.xor,

+ ast.BitAnd: op.and_,

+ ast.BitOr: op.or_,

+ ast.Invert: op.invert,

+ ast.USub: op.neg

+ }

+

+ def __init__(self):

+ self._debug = False

+ self._expression = ''

+ self._namespace = {}

+ self._get_variable = None

+

+ def eval(self, expr, vars={}):

+ self._expression = expr

+ if type(vars) is dict:

+ self._namespace = vars

+ self._get_variable = None

+ else:

+ self._namespace = {}

+ self._get_variable = vars

+ node = ast.parse(self._expression, mode='eval')

+ result = self.visit(node.body)

+ if self._debug:

+ print('EVAL [ %s ] = %s' % (expr, str(result)))

+ return result

+

+ def visit_Name(self, node):

+ if self._get_variable is not None:

+ return self._get_variable(node.id)

+ else:

+ return self._namespace[node.id]

+

+ def visit_Num(self, node):

+ return node.n

+

+ def visit_NameConstant(self, node):

+ return node.value

+

+ def visit_BoolOp(self, node):

+ result = False

+ if isinstance(node.op, ast.And):

+ for value in node.values:

+ result = self.visit(value)

+ if not result:

+ break

+ elif isinstance(node.op, ast.Or):

+ for value in node.values:

+ result = self.visit(value)

+ if result:

+ break

+ return bool(result)

+

+ def visit_UnaryOp(self, node):

+ val = self.visit(node.operand)

+ return ExpressionEval.operators[type(node.op)](val)

+

+ def visit_BinOp(self, node):

+ lhs = self.visit(node.left)

+ rhs = self.visit(node.right)

+ return ExpressionEval.operators[type(node.op)](lhs, rhs)

+

+ def visit_Compare(self, node):

+ right = self.visit(node.left)

+ result = True

+ for operation, comp in zip(node.ops, node.comparators):

+ if not result:

+ break

+ left = right

+ right = self.visit(comp)

+ result = ExpressionEval.operators[type(operation)](left, right)

+ return result

+

+ def visit_Call(self, node):

+ if node.func.id in ['ternary']:

+ condition = self.visit(node.args[0])

+ val_true = self.visit(node.args[1])

+ val_false = self.visit(node.args[2])

+ return val_true if condition else val_false

+ elif node.func.id in ['offset', 'length']:

+ if self._get_variable is not None:

+ return self._get_variable(node.args[0].s, node.func.id)

+ else:

+ raise ValueError("Unsupported function: " + repr(node))

+

+ def generic_visit(self, node):

+ raise ValueError("malformed node or string: " + repr(node))
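`ExpressionEval` above follows the standard safe-evaluation pattern: walk the parsed AST and permit only whitelisted node types and operators, raising on anything else. A minimal standalone sketch of the same pattern (using `ast.Constant`, which replaces the deprecated `ast.Num` on Python 3.8+; note `ast.Div` maps to floor division, matching the table above):

```python
import ast
import operator as op

# Whitelist: only these operators can appear in an expression.
_OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
        ast.Div: op.floordiv, ast.Mod: op.mod,
        ast.BitAnd: op.and_, ast.BitOr: op.or_, ast.BitXor: op.xor,
        ast.USub: op.neg, ast.Invert: op.invert}

def safe_eval(expr, names=None):
    names = names or {}

    def visit(node):
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.Name):
            return names[node.id]          # variable lookup
        if isinstance(node, ast.BinOp):
            return _OPS[type(node.op)](visit(node.left), visit(node.right))
        if isinstance(node, ast.UnaryOp):
            return _OPS[type(node.op)](visit(node.operand))
        raise ValueError('malformed node: %r' % node)

    return visit(ast.parse(expr, mode='eval').body)
```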

+

+

+class CFG_YAML():

+ TEMPLATE = 'template'

+ CONFIGS = 'configs'

+ VARIABLE = 'variable'

+

+ def __init__(self):

+ self.log_line = False

+ self.allow_template = False

+ self.cfg_tree = None

+ self.tmp_tree = None

+ self.var_dict = None

+ self.def_dict = {}

+ self.yaml_path = ''

+ self.lines = []

+ self.full_lines = []

+ self.index = 0

+ self.re_expand = re.compile(

+ r'(.+:\s+|\s*\-\s*)!expand\s+\{\s*(\w+_TMPL)\s*:\s*\[(.+)]\s*\}')

+ self.re_include = re.compile(r'(.+:\s+|\s*\-\s*)!include\s+(.+)')

+

+ @staticmethod

+ def count_indent(line):

+ return next((i for i, c in enumerate(line) if not c.isspace()),

+ len(line))

+

+ @staticmethod

+ def substitue_args(text, arg_dict):

+ for arg in arg_dict:

+ text = text.replace('$' + arg, arg_dict[arg])

+ return text

+

+ @staticmethod

+ def dprint(*args):

+ pass

+

+ def process_include(self, line, insert=True):

+ match = self.re_include.match(line)

+ if not match:

+ raise Exception("Invalid !include format '%s' !" % line.strip())

+

+ prefix = match.group(1)

+ include = match.group(2)

+ if prefix.strip() == '-':

+ prefix = ''

+ adjust = 0

+ else:

+ adjust = 2

+

+ include = strip_quote(include)

+ request = CFG_YAML.count_indent(line) + adjust

+

+ if self.log_line:

+ # remove the include line itself

+ del self.full_lines[-1]

+

+ inc_path = os.path.join(self.yaml_path, include)

+ if not os.path.exists(inc_path):

+ # try relative path to project root

+ try_path = os.path.join(os.path.dirname(os.path.realpath(__file__)

+ ), "../..", include)

+ if os.path.exists(try_path):

+ inc_path = try_path

+ else:

+ raise Exception("ERROR: Cannot open file '%s'." % inc_path)

+

+ lines = read_lines(inc_path)

+ current = 0

+ same_line = False

+ for idx, each in enumerate(lines):

+ start = each.lstrip()

+ if start == '' or start[0] == '#':

+ continue

+

+ if start[0] == '>':

+ # append the content directly at the same line

+ same_line = True

+

+ start = idx

+ current = CFG_YAML.count_indent(each)

+ break

+

+ lines = lines[start+1:] if same_line else lines[start:]

+ leading = ''

+ if same_line:

+ request = len(prefix)

+ leading = '>'

+

+ lines = [prefix + '%s\n' % leading] + [' ' * request +

+ i[current:] for i in lines]

+ if insert:

+ self.lines = lines + self.lines

+

+ return lines

+

+ def process_expand(self, line):

+ match = self.re_expand.match(line)

+ if not match:

+ raise Exception("Invalid !expand format '%s' !" % line.strip())

+ lines = []

+ prefix = match.group(1)

+ temp_name = match.group(2)

+ args = match.group(3)

+

+ if prefix.strip() == '-':

+ indent = 0

+ else:

+ indent = 2

+ lines = self.process_expand_template(temp_name, prefix, args, indent)

+ self.lines = lines + self.lines

+

+ def process_expand_template(self, temp_name, prefix, args, indent=2):

+ # expand text with arg substitution

+ if temp_name not in self.tmp_tree:

+ raise Exception("Could not find template '%s' !" % temp_name)

+ parts = args.split(',')

+ parts = [i.strip() for i in parts]

+ num = len(parts)

+ arg_dict = dict(zip(['(%d)' % (i + 1) for i in range(num)], parts))

+ str_data = self.tmp_tree[temp_name]

+ text = DefTemplate(str_data).safe_substitute(self.def_dict)

+ text = CFG_YAML.substitue_args(text, arg_dict)

+ target = CFG_YAML.count_indent(prefix) + indent

+ current = CFG_YAML.count_indent(text)

+ padding = target * ' '

+ if indent == 0:

+ leading = []

+ else:

+ leading = [prefix + '\n']

+ text = leading + [(padding + i + '\n')[current:]

+ for i in text.splitlines()]

+ return text

+

+ def load_file(self, yaml_file):

+ self.index = 0

+ self.lines = read_lines(yaml_file)

+

+ def peek_line(self):

+ if len(self.lines) == 0:

+ return None

+ else:

+ return self.lines[0]

+

+ def put_line(self, line):

+ self.lines.insert(0, line)

+ if self.log_line:

+ del self.full_lines[-1]

+

+ def get_line(self):

+ if len(self.lines) == 0:

+ return None

+ else:

+ line = self.lines.pop(0)

+ if self.log_line:

+ self.full_lines.append(line.rstrip())

+ return line

+

+ def get_multiple_line(self, indent):

+ text = ''

+ newind = indent + 1

+ while True:

+ line = self.peek_line()

+ if line is None:

+ break

+ sline = line.strip()

+ if sline != '':

+ newind = CFG_YAML.count_indent(line)

+ if newind <= indent:

+ break

+ self.get_line()

+ if sline != '':

+ text = text + line

+ return text

+

+ def traverse_cfg_tree(self, handler):

+ def _traverse_cfg_tree(root, level=0):

+ # config structure

+ for key in root:

+ if type(root[key]) is OrderedDict:

+ level += 1

+ handler(key, root[key], level)

+ _traverse_cfg_tree(root[key], level)

+ level -= 1

+ _traverse_cfg_tree(self.cfg_tree)

+

+ def count(self):

+ def _count(name, cfgs, level):

+ num[0] += 1

+ num = [0]

+ self.traverse_cfg_tree(_count)

+ return num[0]

+

+ def parse(self, parent_name='', curr=None, level=0):

+ child = None

+ last_indent = None

+ key = ''

+ temp_chk = {}

+

+ while True:

+ line = self.get_line()

+ if line is None:

+ break

+

+ curr_line = line.strip()

+ if curr_line == '' or curr_line[0] == '#':

+ continue

+

+ indent = CFG_YAML.count_indent(line)

+ if last_indent is None:

+ last_indent = indent

+

+ if indent != last_indent:

+ # outside of current block, put the line back to queue

+ self.put_line(' ' * indent + curr_line)

+

+ if curr_line.endswith(': >'):

+ # multiline marker

+ old_count = len(self.full_lines)

+ line = self.get_multiple_line(indent)

+ if self.log_line and not self.allow_template \

+ and '!include ' in line:

+ # expand include in template

+ new_lines = []

+ lines = line.splitlines()

+ for idx, each in enumerate(lines):

+ if '!include ' in each:

+ new_line = ''.join(self.process_include(each,

+ False))

+ new_lines.append(new_line)

+ else:

+ new_lines.append(each)

+ self.full_lines = self.full_lines[:old_count] + new_lines

+ curr_line = curr_line + line

+

+ if indent > last_indent:

+ # child nodes

+ if child is None:

+ raise Exception('Unexpected format at line: %s'

+ % (curr_line))

+

+ level += 1

+ self.parse(key, child, level)

+ level -= 1

+ line = self.peek_line()

+ if line is not None:

+ curr_line = line.strip()

+ indent = CFG_YAML.count_indent(line)

+ if indent >= last_indent:

+ # consume the line

+ self.get_line()

+ else:

+ # end of file

+ indent = -1

+

+ if curr is None:

+ curr = OrderedDict()

+

+ if indent < last_indent:

+ return curr

+

+ marker1 = curr_line[0]

+ marker2 = curr_line[-1]

+ start = 1 if marker1 == '-' else 0

+ pos = curr_line.find(': ')

+ if pos > 0:

+ child = None

+ key = curr_line[start:pos].strip()

+ if curr_line[pos + 2] == '>':

+ curr[key] = curr_line[pos + 3:]

+ else:

+ # XXXX: !include / !expand

+ if '!include ' in curr_line:

+ self.process_include(line)

+ elif '!expand ' in curr_line:

+ if self.allow_template and not self.log_line:

+ self.process_expand(line)

+ else:

+ value_str = curr_line[pos + 2:].strip()

+ curr[key] = value_str

+ if self.log_line and value_str[0] == '{':

+ # expand {FILE: xxxx} format in the log line

+ if value_str[1:].rstrip().startswith('FILE:'):

+ value_bytes = expand_file_value(

+ self.yaml_path, value_str)

+ value_str = bytes_to_bracket_str(value_bytes)

+ self.full_lines[-1] = line[

+ :indent] + curr_line[:pos + 2] + value_str

+

+ elif marker2 == ':':

+ child = OrderedDict()

+ key = curr_line[start:-1].strip()

+ if key == '$ACTION':

+ # special virtual nodes, rename to ensure unique key

+ key = '$ACTION_%04X' % self.index

+ self.index += 1

+ if key in curr:

+ if key not in temp_chk:

+ # check for duplicated keys at same level

+ temp_chk[key] = 1

+ else:

+ raise Exception("Duplicated item '%s:%s' found !"

+ % (parent_name, key))

+

+ curr[key] = child

+ if self.var_dict is None and key == CFG_YAML.VARIABLE:

+ self.var_dict = child

+ if self.tmp_tree is None and key == CFG_YAML.TEMPLATE:

+ self.tmp_tree = child

+ if self.var_dict:

+ for each in self.var_dict:

+ txt = self.var_dict[each]

+ if type(txt) is str:

+ self.def_dict['(%s)' % each] = txt

+ if self.tmp_tree and key == CFG_YAML.CONFIGS:

+ # apply template for the main configs

+ self.allow_template = True

+ else:

+ child = None

+ # - !include cfg_opt.yaml

+ if '!include ' in curr_line:

+ self.process_include(line)

+

+ return curr

+

+ def load_yaml(self, opt_file):

+ self.var_dict = None

+ self.yaml_path = os.path.dirname(opt_file)

+ self.load_file(opt_file)

+ yaml_tree = self.parse()

+ self.tmp_tree = yaml_tree[CFG_YAML.TEMPLATE]

+ self.cfg_tree = yaml_tree[CFG_YAML.CONFIGS]

+ return self.cfg_tree

+

+ def expand_yaml(self, opt_file):

+ self.log_line = True

+ self.load_yaml(opt_file)

+ self.log_line = False

+ text = '\n'.join(self.full_lines)

+ self.full_lines = []

+ return text

+

+

+class DefTemplate(string.Template):

+ idpattern = '\\([_A-Z][_A-Z0-9]*\\)|[_A-Z][_A-Z0-9]*'
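The custom `idpattern` above lets `string.Template` substitute both the plain `$NAME` form and the parenthesized `$(NAME)` form that `def_dict` keys use. A standalone demonstration:

```python
import string

class DefTemplate(string.Template):
    # Accept '(NAME)' (parens included in the key) or a bare 'NAME'.
    idpattern = r'\([_A-Z][_A-Z0-9]*\)|[_A-Z][_A-Z0-9]*'

# Keys for the parenthesized form include the parentheses themselves.
text = DefTemplate('base: $(BASE) size: $SIZE').safe_substitute(
    {'(BASE)': '0x1000', 'SIZE': '0x20'})
```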

+

+

+class CGenYamlCfg:

+ STRUCT = '$STRUCT'

+ bits_width = {'b': 1, 'B': 8, 'W': 16, 'D': 32, 'Q': 64}

+ builtin_option = {'$EN_DIS': [('0', 'Disable'), ('1', 'Enable')]}

+ exclude_struct = ['FSP_UPD_HEADER', 'FSPT_ARCH_UPD',

+ 'FSPM_ARCH_UPD', 'FSPS_ARCH_UPD',

+ 'GPIO_GPP_*', 'GPIO_CFG_DATA',

+ 'GpioConfPad*', 'GpioPinConfig',

+ 'BOOT_OPTION*', 'PLATFORMID_CFG_DATA', '\\w+_Half[01]']

+ include_tag = ['GPIO_CFG_DATA']

+ keyword_set = set(['name', 'type', 'option', 'help', 'length',

+ 'value', 'order', 'struct', 'condition'])

+

+ def __init__(self):

+ self._mode = ''

+ self._debug = False

+ self._macro_dict = {}

+ self.initialize()

+

+ def initialize(self):

+ self._old_bin = None

+ self._cfg_tree = {}

+ self._tmp_tree = {}

+ self._cfg_list = []

+ self._cfg_page = {'root': {'title': '', 'child': []}}

+ self._cur_page = ''

+ self._var_dict = {}

+ self._def_dict = {}

+ self._yaml_path = ''

+

+ @staticmethod

+ def deep_convert_dict(layer):

+ # convert OrderedDict to list + dict

+ new_list = layer

+ if isinstance(layer, OrderedDict):

+ new_list = list(layer.items())

+ for idx, pair in enumerate(new_list):

+ new_node = CGenYamlCfg.deep_convert_dict(pair[1])

+ new_list[idx] = dict({pair[0]: new_node})

+ return new_list

+

+ @staticmethod

+ def deep_convert_list(layer):

+ if isinstance(layer, list):

+ od = OrderedDict({})

+ for each in layer:

+ if isinstance(each, dict):

+ key = next(iter(each))

+ od[key] = CGenYamlCfg.deep_convert_list(each[key])

+ return od

+ else:

+ return layer
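The `deep_convert_dict` / `deep_convert_list` pair above exists because `marshal` cannot serialize `OrderedDict`: the tree is flattened into a list of single-key dicts for saving, and rebuilt on load. A standalone sketch of the round trip:

```python
from collections import OrderedDict

def deep_convert_dict(layer):
    # OrderedDict tree -> list of {key: converted_value} entries.
    if isinstance(layer, OrderedDict):
        return [{k: deep_convert_dict(v)} for k, v in layer.items()]
    return layer

def deep_convert_list(layer):
    # list of single-key dicts -> OrderedDict tree (inverse direction).
    if isinstance(layer, list):
        od = OrderedDict()
        for each in layer:
            key = next(iter(each))
            od[key] = deep_convert_list(each[key])
        return od
    return layer

tree = OrderedDict([('a', 1), ('b', OrderedDict([('c', 2)]))])
flat = deep_convert_dict(tree)
```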

+

+ @staticmethod

+ def expand_include_files(file_path, cur_dir=''):

+ if cur_dir == '':

+ cur_dir = os.path.dirname(file_path)

+ file_path = os.path.basename(file_path)

+

+ input_file_path = os.path.join(cur_dir, file_path)

+ file = open(input_file_path, "r")

+ lines = file.readlines()

+ file.close()

+ new_lines = []

+ for line_num, line in enumerate(lines):

+ match = re.match("^!include\\s*(.+)?$", line.strip())

+ if match:

+ inc_path = match.group(1)

+ tmp_path = os.path.join(cur_dir, inc_path)

+ org_path = tmp_path

+ if not os.path.exists(tmp_path):

+ cur_dir = os.path.join(os.path.dirname

+ (os.path.realpath(__file__)

+ ), "..", "..")

+ tmp_path = os.path.join(cur_dir, inc_path)

+ if not os.path.exists(tmp_path):

+ raise Exception("ERROR: Cannot open include\

+ file '%s'." % org_path)

+ else:

+ new_lines.append(('# Included from file: %s\n' % inc_path,

+ tmp_path, 0))

+ new_lines.append(('# %s\n' % ('=' * 80), tmp_path, 0))

+ new_lines.extend(CGenYamlCfg.expand_include_files

+ (inc_path, cur_dir))

+ else:

+ new_lines.append((line, input_file_path, line_num))

+

+ return new_lines

+

+ @staticmethod

+ def format_struct_field_name(input, count=0):

+ name = ''

+ cap = True

+ if '_' in input:

+ input = input.lower()

+ for each in input:

+ if each == '_':

+ cap = True

+ continue

+ elif cap:

+ each = each.upper()

+ cap = False

+ name = name + each

+

+ if count > 1:

+ name = '%s[%d]' % (name, count)

+

+ return name

+

+    def get_mode(self):
+        return self._mode
+
+    def set_mode(self, mode):
+        self._mode = mode
+
+    def get_last_error(self):
+        return ''
+
+    def get_variable(self, var, attr='value'):
+        if var in self._var_dict:
+            var = self._var_dict[var]
+            return var
+
+        item = self.locate_cfg_item(var, False)
+        if item is None:
+            raise ValueError("Cannot find variable '%s' !" % var)
+
+        if item:
+            if 'indx' in item:
+                item = self.get_item_by_index(item['indx'])
+            if attr == 'offset':
+                var = item['offset']
+            elif attr == 'length':
+                var = item['length']
+            elif attr == 'value':
+                var = self.get_cfg_item_value(item)
+            else:
+                raise ValueError("Unsupported variable attribute '%s' !" %
+                                 attr)
+        return var
+
+    def eval(self, expr):
+        def _handler(pattern):
+            if pattern.group(1):
+                target = 1
+            else:
+                target = 2
+            result = self.get_variable(pattern.group(target))
+            if result is None:
+                raise ValueError('Unknown variable $(%s) !' %
+                                 pattern.group(target))
+            return hex(result)
+
+        expr_eval = ExpressionEval()
+        if '$' in expr:
+            # replace known variable first
+            expr = re.sub(r'\$\(([_a-zA-Z][\w\.]*)\)|\$([_a-zA-Z][\w\.]*)',
+                          _handler, expr)
+        return expr_eval.eval(expr, self.get_variable)
+
+    def parse_macros(self, macro_def_str):
+        # ['-DABC=1', '-D', 'CFG_DEBUG=1', '-D', 'CFG_OUTDIR=Build']
+        self._macro_dict = {}
+        is_expression = False
+        for macro in macro_def_str:
+            if macro.startswith('-D'):
+                is_expression = True
+                if len(macro) > 2:
+                    macro = macro[2:]
+                else:
+                    continue
+            if is_expression:
+                is_expression = False
+                match = re.match("(\\w+)=(.+)", macro)
+                if match:
+                    self._macro_dict[match.group(1)] = match.group(2)
+                else:
+                    match = re.match("(\\w+)", macro)
+                    if match:
+                        self._macro_dict[match.group(1)] = ''
+        if len(self._macro_dict) == 0:
+            error = 1
+        else:
+            error = 0
+            if self._debug:
+                print("INFO : Macro dictionary:")
+                for each in self._macro_dict:
+                    print(" $(%s) = [ %s ]"
+                          % (each, self._macro_dict[each]))
+        return error
+
+    def get_cfg_list(self, page_id=None):
+        if page_id is None:
+            # return full list
+            return self._cfg_list
+        else:
+            # build a new list for items under a page ID
+            cfgs = [i for i in self._cfg_list if i['cname'] and
+                    (i['page'] == page_id)]
+            return cfgs
+
+    def get_cfg_page(self):
+        return self._cfg_page
+
+    def get_cfg_item_length(self, item):
+        return item['length']
+
+    def get_cfg_item_value(self, item, array=False):
+        value_str = item['value']
+        length = item['length']
+        return self.get_value(value_str, length, array)
+
+    def format_value_to_str(self, value, bit_length, old_value=''):
+        # value is always int
+        length = (bit_length + 7) // 8
+        fmt = ''
+        if old_value.startswith('0x'):
+            fmt = '0x'
+        elif old_value and (old_value[0] in ['"', "'", '{']):
+            fmt = old_value[0]
+        else:
+            fmt = ''
+
+        bvalue = value_to_bytearray(value, length)
+        if fmt in ['"', "'"]:
+            svalue = bvalue.rstrip(b'\x00').decode()
+            value_str = fmt + svalue + fmt
+        elif fmt == "{":
+            value_str = '{ ' + ', '.join(['0x%02x' % i for i in bvalue]) + ' }'
+        elif fmt == '0x':
+            hex_len = length * 2
+            if len(old_value) == hex_len + 2:
+                fstr = '0x%%0%dx' % hex_len
+            else:
+                fstr = '0x%x'
+            value_str = fstr % value
+        else:
+            if length <= 2:
+                value_str = '%d' % value
+            elif length <= 8:
+                value_str = '0x%x' % value
+            else:
+                value_str = '{ ' + ', '.join(['0x%02x' % i for i in
+                                              bvalue]) + ' }'
+        return value_str
+
+    def reformat_value_str(self, value_str, bit_length, old_value=None):
+        value = self.parse_value(value_str, bit_length, False)
+        if old_value is None:
+            old_value = value_str
+        new_value = self.format_value_to_str(value, bit_length, old_value)
+        return new_value
+
+    def get_value(self, value_str, bit_length, array=True):
+        value_str = value_str.strip()
+        if value_str[0] == "'" and value_str[-1] == "'" or \
+           value_str[0] == '"' and value_str[-1] == '"':
+            value_str = value_str[1:-1]
+            bvalue = bytearray(value_str.encode())
+            if len(bvalue) == 0:
+                bvalue = bytearray(b'\x00')
+            if array:
+                return bvalue
+            else:
+                return bytes_to_value(bvalue)
+        else:
+            if value_str[0] in '{':
+                value_str = value_str[1:-1].strip()
+            value = 0
+            for each in value_str.split(',')[::-1]:
+                each = each.strip()
+                value = (value << 8) | int(each, 0)
+            if array:
+                length = (bit_length + 7) // 8
+                return value_to_bytearray(value, length)
+            else:
+                return value
+
+    def parse_value(self, value_str, bit_length, array=True):
+        length = (bit_length + 7) // 8
+        if check_quote(value_str):
+            value_str = bytes_to_bracket_str(value_str[1:-1].encode())
+        elif (',' in value_str) and (value_str[0] != '{'):
+            value_str = '{ %s }' % value_str
+        if value_str[0] == '{':
+            result = expand_file_value(self._yaml_path, value_str)
+            if len(result) == 0:
+                bin_list = value_str[1:-1].split(',')
+                value = 0
+                bit_len = 0
+                unit_len = 1
+                for idx, element in enumerate(bin_list):
+                    each = element.strip()
+                    if len(each) == 0:
+                        continue
+
+                    in_bit_field = False
+                    if each[0] in "'" + '"':
+                        each_value = bytearray(each[1:-1], 'utf-8')
+                    elif ':' in each:
+                        match = re.match("^(.+):(\\d+)([b|B|W|D|Q])$", each)
+                        if match is None:
+                            raise SystemExit("Exception: Invalid value\
+ list format '%s' !" % each)
+                        if match.group(1) == '0' and match.group(2) == '0':
+                            unit_len = CGenYamlCfg.bits_width[match.group(3)
+                                                              ] // 8
+                        cur_bit_len = int(match.group(2)
+                                          ) * CGenYamlCfg.bits_width[
+                                              match.group(3)]
+                        value += ((self.eval(match.group(1)) & (
+                            1 << cur_bit_len) - 1)) << bit_len
+                        bit_len += cur_bit_len
+                        each_value = bytearray()
+                        if idx + 1 < len(bin_list):
+                            in_bit_field = True
+                    else:
+                        try:
+                            each_value = value_to_bytearray(
+                                self.eval(each.strip()), unit_len)
+                        except Exception:
+                            raise SystemExit("Exception: Value '%s' cannot \
+fit into %d bytes !" % (each, unit_len))
+
+                    if not in_bit_field:
+                        if bit_len > 0:
+                            if bit_len % 8 != 0:
+                                raise SystemExit("Exception: Invalid bit \
+field alignment '%s' !" % value_str)
+                            result.extend(value_to_bytes(value, bit_len // 8))
+                            value = 0
+                            bit_len = 0
+
+                    result.extend(each_value)
+
+        elif check_quote(value_str):
+            result = bytearray(value_str[1:-1], 'utf-8')  # Excluding quotes
+        else:
+            result = value_to_bytearray(self.eval(value_str), length)
+
+        if len(result) < length:
+            result.extend(b'\x00' * (length - len(result)))
+        elif len(result) > length:
+            raise SystemExit("Exception: Value '%s' is too big to fit \
+into %d bytes !" % (value_str, length))
+
+        if array:
+            return result
+        else:
+            return bytes_to_value(result)
+
+    def get_cfg_item_options(self, item):
+        tmp_list = []
+        if item['type'] == "Combo":
+            if item['option'] in CGenYamlCfg.builtin_option:
+                for op_val, op_str in CGenYamlCfg.builtin_option[item['option'
+                                                                      ]]:
+                    tmp_list.append((op_val, op_str))
+            else:
+                opt_list = item['option'].split(',')
+                for option in opt_list:
+                    option = option.strip()
+                    try:
+                        (op_val, op_str) = option.split(':')
+                    except Exception:
+                        raise SystemExit("Exception: Invalid \
+option format '%s' !" % option)
+                    tmp_list.append((op_val, op_str))
+        return tmp_list
+
+    def get_page_title(self, page_id, top=None):
+        if top is None:
+            top = self.get_cfg_page()['root']
+        for node in top['child']:
+            page_key = next(iter(node))
+            if page_id == page_key:
+                return node[page_key]['title']
+            else:
+                result = self.get_page_title(page_id, node[page_key])
+                if result is not None:
+                    return result
+        return None
+
+    def print_pages(self, top=None, level=0):
+        if top is None:
+            top = self.get_cfg_page()['root']
+        for node in top['child']:
+            page_id = next(iter(node))
+            print('%s%s: %s' % (' ' * level, page_id, node[page_id]['title']))
+            level += 1
+            self.print_pages(node[page_id], level)
+            level -= 1
+
+    def get_item_by_index(self, index):
+        return self._cfg_list[index]
+
+    def get_item_by_path(self, path):
+        node = self.locate_cfg_item(path)
+        if node:
+            return self.get_item_by_index(node['indx'])
+        else:
+            return None
+
+    def locate_cfg_path(self, item):
+        def _locate_cfg_path(root, level=0):
+            # config structure
+            if item is root:
+                return path
+            for key in root:
+                if type(root[key]) is OrderedDict:
+                    level += 1
+                    path.append(key)
+                    ret = _locate_cfg_path(root[key], level)
+                    if ret:
+                        return ret
+                    path.pop()
+            return None
+        path = []
+        return _locate_cfg_path(self._cfg_tree)
+
+    def locate_cfg_item(self, path, allow_exp=True):
+        def _locate_cfg_item(root, path, level=0):
+            if len(path) == level:
+                return root
+            next_root = root.get(path[level], None)
+            if next_root is None:
+                if allow_exp:
+                    raise Exception('Not a valid CFG config option path: %s' %
+                                    '.'.join(path[:level+1]))
+                else:
+                    return None
+            return _locate_cfg_item(next_root, path, level + 1)
+
+        path_nodes = path.split('.')
+        return _locate_cfg_item(self._cfg_tree, path_nodes)
+
+    def traverse_cfg_tree(self, handler, top=None):
+        def _traverse_cfg_tree(root, level=0):
+            # config structure
+            for key in root:
+                if type(root[key]) is OrderedDict:
+                    level += 1
+                    handler(key, root[key], level)
+                    _traverse_cfg_tree(root[key], level)
+                    level -= 1
+
+        if top is None:
+            top = self._cfg_tree
+        _traverse_cfg_tree(top)
+
+    def print_cfgs(self, root=None, short=True, print_level=256):
+        def _print_cfgs(name, cfgs, level):
+
+            if 'indx' in cfgs:
+                act_cfg = self.get_item_by_index(cfgs['indx'])
+            else:
+                offset = 0
+                length = 0
+                value = ''
+                if CGenYamlCfg.STRUCT in cfgs:
+                    cfg = cfgs[CGenYamlCfg.STRUCT]
+                    offset = int(cfg['offset'])
+                    length = int(cfg['length'])
+                    if 'value' in cfg:
+                        value = cfg['value']
+                if length == 0:
+                    return
+                act_cfg = dict({'value': value, 'offset': offset,
+                                'length': length})
+            value = act_cfg['value']
+            bit_len = act_cfg['length']
+            offset = (act_cfg['offset'] + 7) // 8
+            if value != '':
+                try:
+                    value = self.reformat_value_str(act_cfg['value'],
+                                                    act_cfg['length'])
+                except Exception:
+                    value = act_cfg['value']
+            length = bit_len // 8
+            bit_len = '(%db)' % bit_len if bit_len % 8 else '' * 4
+            if level <= print_level:
+                if short and len(value) > 40:
+                    value = '%s ... %s' % (value[:20], value[-20:])
+                print('%04X:%04X%-6s %s%s : %s' % (offset, length, bit_len,
+                                                   ' ' * level, name, value))
+
+        self.traverse_cfg_tree(_print_cfgs)
+
+    def build_var_dict(self):
+        def _build_var_dict(name, cfgs, level):
+            if level <= 2:
+                if CGenYamlCfg.STRUCT in cfgs:
+                    struct_info = cfgs[CGenYamlCfg.STRUCT]
+                    self._var_dict['_LENGTH_%s_' % name] = struct_info[
+                        'length'] // 8
+                    self._var_dict['_OFFSET_%s_' % name] = struct_info[
+                        'offset'] // 8
+
+        self._var_dict = {}
+        self.traverse_cfg_tree(_build_var_dict)
+        self._var_dict['_LENGTH_'] = self._cfg_tree[CGenYamlCfg.STRUCT][
+            'length'] // 8
+        return 0
+
+    def add_cfg_page(self, child, parent, title=''):
+        def _add_cfg_page(cfg_page, child, parent):
+            key = next(iter(cfg_page))
+            if parent == key:
+                cfg_page[key]['child'].append({child: {'title': title,
+                                                       'child': []}})
+                return True
+            else:
+                result = False
+                for each in cfg_page[key]['child']:
+                    if _add_cfg_page(each, child, parent):
+                        result = True
+                        break
+                return result
+
+        return _add_cfg_page(self._cfg_page, child, parent)
+
+    def set_cur_page(self, page_str):
+        if not page_str:
+            return
+
+        if ',' in page_str:
+            page_list = page_str.split(',')
+        else:
+            page_list = [page_str]
+        for page_str in page_list:
+            parts = page_str.split(':')
+            if len(parts) in [1, 3]:
+                page = parts[0].strip()
+                if len(parts) == 3:
+                    # it is a new page definition, add it into tree
+                    parent = parts[1] if parts[1] else 'root'
+                    parent = parent.strip()
+                    if parts[2][0] == '"' and parts[2][-1] == '"':
+                        parts[2] = parts[2][1:-1]
+
+                    if not self.add_cfg_page(page, parent, parts[2]):
+                        raise SystemExit("Error: Cannot find parent page \
+'%s'!" % parent)
+            else:
+                raise SystemExit("Error: Invalid page format '%s' !"
+                                 % page_str)
+            self._cur_page = page
+
+    def extend_variable(self, line):
+        # replace all variables
+        if line == '':
+            return line
+        loop = 2
+        while loop > 0:
+            line_after = DefTemplate(line).safe_substitute(self._def_dict)
+            if line == line_after:
+                break
+            loop -= 1
+            line = line_after
+        return line_after
+
+    def reformat_number_per_type(self, itype, value):
+        if check_quote(value) or value.startswith('{'):
+            return value
+        parts = itype.split(',')
+        if len(parts) > 3 and parts[0] == 'EditNum':
+            num_fmt = parts[1].strip()
+        else:
+            num_fmt = ''
+        if num_fmt == 'HEX' and not value.startswith('0x'):
+            value = '0x%X' % int(value, 10)
+        elif num_fmt == 'DEC' and value.startswith('0x'):
+            value = '%d' % int(value, 16)
+        return value
+
+    def add_cfg_item(self, name, item, offset, path):
+
+        self.set_cur_page(item.get('page', ''))
+
+        if name[0] == '$':
+            # skip all virtual node
+            return 0
+
+        if not set(item).issubset(CGenYamlCfg.keyword_set):
+            for each in list(item):
+                if each not in CGenYamlCfg.keyword_set:
+                    raise Exception("Invalid attribute '%s' for '%s'!" %
+                                    (each, '.'.join(path)))
+
+        length = item.get('length', 0)
+        if type(length) is str:
+            match = re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)\\s*$", length)
+            if match:
+                unit_len = CGenYamlCfg.bits_width[match.group(2)]
+                length = int(match.group(1), 10) * unit_len
+            else:
+                try:
+                    length = int(length, 0) * 8
+                except Exception:
+                    raise Exception("Invalid length field '%s' for '%s' !" %
+                                    (length, '.'.join(path)))
+
+                if offset % 8 > 0:
+                    raise Exception("Invalid alignment for field '%s' for \
+'%s' !" % (name, '.'.join(path)))
+        else:
+            # define is length in bytes
+            length = length * 8
+
+        if not name.isidentifier():
+            raise Exception("Invalid config name '%s' for '%s' !" %
+                            (name, '.'.join(path)))
+
+        itype = str(item.get('type', 'Reserved'))
+        value = str(item.get('value', ''))
+        if value:
+            if not (check_quote(value) or value.startswith('{')):
+                if ',' in value:
+                    value = '{ %s }' % value
+                else:
+                    value = self.reformat_number_per_type(itype, value)
+
+        help = str(item.get('help', ''))
+        if '\n' in help:
+            help = ' '.join([i.strip() for i in help.splitlines()])
+
+        option = str(item.get('option', ''))
+        if '\n' in option:
+            option = ' '.join([i.strip() for i in option.splitlines()])
+
+        # extend variables for value and condition
+        condition = str(item.get('condition', ''))
+        if condition:
+            condition = self.extend_variable(condition)
+        value = self.extend_variable(value)
+
+        order = str(item.get('order', ''))
+        if order:
+            if '.' in order:
+                (major, minor) = order.split('.')
+                order = int(major, 16)
+            else:
+                order = int(order, 16)
+        else:
+            order = offset
+
+        cfg_item = dict()
+        cfg_item['length'] = length
+        cfg_item['offset'] = offset
+        cfg_item['value'] = value
+        cfg_item['type'] = itype
+        cfg_item['cname'] = str(name)
+        cfg_item['name'] = str(item.get('name', ''))
+        cfg_item['help'] = help
+        cfg_item['option'] = option
+        cfg_item['page'] = self._cur_page
+        cfg_item['order'] = order
+        cfg_item['path'] = '.'.join(path)
+        cfg_item['condition'] = condition
+        if 'struct' in item:
+            cfg_item['struct'] = item['struct']
+        self._cfg_list.append(cfg_item)
+
+        item['indx'] = len(self._cfg_list) - 1
+
+        # remove used info for reducing pkl size
+        item.pop('option', None)
+        item.pop('condition', None)
+        item.pop('help', None)
+        item.pop('name', None)
+        item.pop('page', None)
+
+        return length
+
+    def build_cfg_list(self, cfg_name='', top=None, path=[],
+                       info={'offset': 0}):
+        if top is None:
+            top = self._cfg_tree
+            info.clear()
+            info = {'offset': 0}
+
+        start = info['offset']
+        is_leaf = True
+        for key in top:
+            path.append(key)
+            if type(top[key]) is OrderedDict:
+                is_leaf = False
+                self.build_cfg_list(key, top[key], path, info)
+            path.pop()
+
+        if is_leaf:
+            length = self.add_cfg_item(cfg_name, top, info['offset'], path)
+            info['offset'] += length
+        elif cfg_name == '' or (cfg_name and cfg_name[0] != '$'):
+            # check first element for struct
+            first = next(iter(top))
+            struct_str = CGenYamlCfg.STRUCT
+            if first != struct_str:
+                struct_node = OrderedDict({})
+                top[struct_str] = struct_node
+                top.move_to_end(struct_str, False)
+            else:
+                struct_node = top[struct_str]
+            struct_node['offset'] = start
+            struct_node['length'] = info['offset'] - start
+            if struct_node['length'] % 8 != 0:
+                raise SystemExit("Error: Bits length not aligned for %s !" %
+                                 str(path))
+
+    def get_field_value(self, top=None):
+        def _get_field_value(name, cfgs, level):
+            if 'indx' in cfgs:
+                act_cfg = self.get_item_by_index(cfgs['indx'])
+                if act_cfg['length'] == 0:
+                    return
+                value = self.get_value(act_cfg['value'], act_cfg['length'],
+                                       False)
+                set_bits_to_bytes(result, act_cfg['offset'] -
+                                  struct_info['offset'], act_cfg['length'],
+                                  value)
+
+        if top is None:
+            top = self._cfg_tree
+        struct_info = top[CGenYamlCfg.STRUCT]
+        result = bytearray((struct_info['length'] + 7) // 8)
+        self.traverse_cfg_tree(_get_field_value, top)
+        return result
+
+    def set_field_value(self, top, value_bytes, force=False):
+        def _set_field_value(name, cfgs, level):
+            if 'indx' not in cfgs:
+                return
+            act_cfg = self.get_item_by_index(cfgs['indx'])
+            if force or act_cfg['value'] == '':
+                value = get_bits_from_bytes(full_bytes,
+                                            act_cfg['offset'] -
+                                            struct_info['offset'],
+                                            act_cfg['length'])
+                act_val = act_cfg['value']
+                if act_val == '':
+                    act_val = '%d' % value
+                act_val = self.reformat_number_per_type(act_cfg
+                                                        ['type'],
+                                                        act_val)
+                act_cfg['value'] = self.format_value_to_str(
+                    value, act_cfg['length'], act_val)
+
+        if 'indx' in top:
+            # it is config option
+            value = bytes_to_value(value_bytes)
+            act_cfg = self.get_item_by_index(top['indx'])
+            act_cfg['value'] = self.format_value_to_str(
+                value, act_cfg['length'], act_cfg['value'])
+        else:
+            # it is structure
+            struct_info = top[CGenYamlCfg.STRUCT]
+            length = struct_info['length'] // 8
+            full_bytes = bytearray(value_bytes[:length])
+            if len(full_bytes) < length:
+                full_bytes.extend(bytearray(length - len(value_bytes)))
+            self.traverse_cfg_tree(_set_field_value, top)
+
+    def update_def_value(self):
+        def _update_def_value(name, cfgs, level):
+            if 'indx' in cfgs:
+                act_cfg = self.get_item_by_index(cfgs['indx'])
+                if act_cfg['value'] != '' and act_cfg['length'] > 0:
+                    try:
+                        act_cfg['value'] = self.reformat_value_str(
+                            act_cfg['value'], act_cfg['length'])
+                    except Exception:
+                        raise Exception("Invalid value expression '%s' \
+for '%s' !" % (act_cfg['value'], act_cfg['path']))
+            else:
+                if CGenYamlCfg.STRUCT in cfgs and 'value' in \
+                        cfgs[CGenYamlCfg.STRUCT]:
+                    curr = cfgs[CGenYamlCfg.STRUCT]
+                    value_bytes = self.get_value(curr['value'],
+                                                 curr['length'], True)
+                    self.set_field_value(cfgs, value_bytes)
+
+        self.traverse_cfg_tree(_update_def_value, self._cfg_tree)
+
+    def evaluate_condition(self, item):
+        expr = item['condition']
+        result = self.parse_value(expr, 1, False)
+        return result
+
+    def detect_fsp(self):
+        cfg_segs = self.get_cfg_segment()
+        if len(cfg_segs) == 3:
+            fsp = True
+            for idx, seg in enumerate(cfg_segs):
+                if not seg[0].endswith('UPD_%s' % 'TMS'[idx]):
+                    fsp = False
+                    break
+        else:
+            fsp = False
+        if fsp:
+            self.set_mode('FSP')
+        return fsp
+
+    def get_cfg_segment(self):
+        def _get_cfg_segment(name, cfgs, level):
+            if 'indx' not in cfgs:
+                if name.startswith('$ACTION_'):
+                    if 'find' in cfgs:
+                        find[0] = cfgs['find']
+            else:
+                if find[0]:
+                    act_cfg = self.get_item_by_index(cfgs['indx'])
+                    segments.append([find[0], act_cfg['offset'] // 8, 0])
+                    find[0] = ''
+            return
+
+        find = ['']
+        segments = []
+        self.traverse_cfg_tree(_get_cfg_segment, self._cfg_tree)
+        cfg_len = self._cfg_tree[CGenYamlCfg.STRUCT]['length'] // 8
+        if len(segments) == 0:
+            segments.append(['', 0, cfg_len])
+
+        segments.append(['', cfg_len, 0])
+        cfg_segs = []
+        for idx, each in enumerate(segments[:-1]):
+            cfg_segs.append((each[0], each[1],
+                             segments[idx+1][1] - each[1]))
+
+        return cfg_segs
+
+    def get_bin_segment(self, bin_data):
+        cfg_segs = self.get_cfg_segment()
+        bin_segs = []
+        for seg in cfg_segs:
+            key = seg[0].encode()
+            if len(key) == 0:
+                bin_segs.append([seg[0], 0, len(bin_data)])
+                break
+            pos = bin_data.find(key)
+            if pos >= 0:
+                # ensure no other match for the key
+                next_pos = bin_data.find(key, pos + len(seg[0]))
+                if next_pos >= 0:
+                    if key == b'$SKLFSP$' or key == b'$BSWFSP$':
+                        string = ('Warning: Multiple matches for %s in '
+                                  'binary!\n\nA workaround applied to such '
+                                  'FSP 1.x binary to use second'
+                                  ' match instead of first match!' % key)
+                        messagebox.showwarning('Warning!', string)
+                        pos = next_pos
+                    else:
+                        print("Warning: Multiple matches for '%s' "
+                              "in binary, the 1st instance will be used !"
+                              % seg[0])
+                bin_segs.append([seg[0], pos, seg[2]])
+            else:
+                raise Exception("Could not find '%s' in binary !"
+                                % seg[0])
+
+        return bin_segs
+
+    def extract_cfg_from_bin(self, bin_data):
+        # get cfg bin length
+        cfg_bins = bytearray()
+        bin_segs = self.get_bin_segment(bin_data)
+        for each in bin_segs:
+            cfg_bins.extend(bin_data[each[1]:each[1] + each[2]])
+        return cfg_bins
+
+    def save_current_to_bin(self):
+        cfg_bins = self.generate_binary_array()
+        if self._old_bin is None:
+            return cfg_bins
+
+        bin_data = bytearray(self._old_bin)
+        bin_segs = self.get_bin_segment(self._old_bin)
+        cfg_off = 0
+        for each in bin_segs:
+            length = each[2]
+            bin_data[each[1]:each[1] + length] = cfg_bins[cfg_off:
+                                                          cfg_off
+                                                          + length]
+            cfg_off += length
+        print('Patched the loaded binary successfully !')
+
+        return bin_data
+
+    def load_default_from_bin(self, bin_data):
+        self._old_bin = bin_data
+        cfg_bins = self.extract_cfg_from_bin(bin_data)
+        self.set_field_value(self._cfg_tree, cfg_bins, True)
+        return cfg_bins
+
+    def generate_binary_array(self, path=''):
+        if path == '':
+            top = None
+        else:
+            top = self.locate_cfg_item(path)
+            if not top:
+                raise Exception("Invalid configuration path '%s' !"
+                                % path)
+        return self.get_field_value(top)
+
+    def generate_binary(self, bin_file_name, path=''):
+        bin_file = open(bin_file_name, "wb")
+        bin_file.write(self.generate_binary_array(path))
+        bin_file.close()
+        return 0
+
+    def write_delta_file(self, out_file, platform_id, out_lines):
+        dlt_fd = open(out_file, "w")
+        dlt_fd.write("%s\n" % get_copyright_header('dlt', True))
+        if platform_id is not None:
+            dlt_fd.write('#\n')
+            dlt_fd.write('# Delta configuration values for '
+                         'platform ID 0x%04X\n'
+                         % platform_id)
+            dlt_fd.write('#\n\n')
+        for line in out_lines:
+            dlt_fd.write('%s\n' % line)
+        dlt_fd.close()
+
+    def override_default_value(self, dlt_file):
+        error = 0
+        dlt_lines = CGenYamlCfg.expand_include_files(dlt_file)
+
+        platform_id = None
+        for line, file_path, line_num in dlt_lines:
+            line = line.strip()
+            if not line or line.startswith('#'):
+                continue
+            match = re.match("\\s*([\\w\\.]+)\\s*\\|\\s*(.+)", line)
+            if not match:
+                raise Exception("Unrecognized line '%s' "
+                                "(File:'%s' Line:%d) !"
+                                % (line, file_path, line_num + 1))
+
+            path = match.group(1)
+            value_str = match.group(2)
+            top = self.locate_cfg_item(path)
+            if not top:
+                raise Exception(
+                    "Invalid configuration '%s' (File:'%s' Line:%d) !" %
+                    (path, file_path, line_num + 1))
+
+            if 'indx' in top:
+                act_cfg = self.get_item_by_index(top['indx'])
+                bit_len = act_cfg['length']
+            else:
+                struct_info = top[CGenYamlCfg.STRUCT]
+                bit_len = struct_info['length']
+
+            value_bytes = self.parse_value(value_str, bit_len)
+            self.set_field_value(top, value_bytes, True)
+
+            if path == 'PLATFORMID_CFG_DATA.PlatformId':
+                platform_id = value_str
+
+        if platform_id is None:
+            raise Exception(
+                "PLATFORMID_CFG_DATA.PlatformId is missing "
+                "in file '%s' !" %
+                (dlt_file))
+
+        return error
+
+    def generate_delta_file_from_bin(self, delta_file, old_data,
+                                     new_data, full=False):
+        new_data = self.load_default_from_bin(new_data)
+        lines = []
+        platform_id = None
+        def_platform_id = 0
+
+        for item in self._cfg_list:
+            if not full and (item['type'] in ['Reserved']):
+                continue
+            old_val = get_bits_from_bytes(old_data, item['offset'],
+                                          item['length'])
+            new_val = get_bits_from_bytes(new_data, item['offset'],
+                                          item['length'])
+
+            full_name = item['path']
+            if 'PLATFORMID_CFG_DATA.PlatformId' == full_name:
+                def_platform_id = old_val
+            if new_val != old_val or full:
+                val_str = self.reformat_value_str(item['value'],
+                                                  item['length'])
+                text = '%-40s | %s' % (full_name, val_str)
+                lines.append(text)
+
+        if self.get_mode() != 'FSP':
+            if platform_id is None or def_platform_id == platform_id:
+                platform_id = def_platform_id
+                print("WARNING: 'PlatformId' configuration is "
+                      "same as default %d!" % platform_id)
+
+            lines.insert(0, '%-40s | %s\n\n' %
+                         ('PLATFORMID_CFG_DATA.PlatformId',
+                          '0x%04X' % platform_id))
+        else:
+            platform_id = None
+
+        self.write_delta_file(delta_file, platform_id, lines)
+
+        return 0
+
+    def generate_delta_file(self, delta_file, bin_file, bin_file2, full=False):
+        fd = open(bin_file, 'rb')
+        new_data = self.extract_cfg_from_bin(bytearray(fd.read()))
+        fd.close()
+
+        if bin_file2 == '':
+            old_data = self.generate_binary_array()
+        else:
+            old_data = new_data
+            fd = open(bin_file2, 'rb')
+            new_data = self.extract_cfg_from_bin(bytearray(fd.read()))
+            fd.close()
+
+        return self.generate_delta_file_from_bin(delta_file,
+                                                 old_data, new_data, full)
+
+    def prepare_marshal(self, is_save):
+        if is_save:
+            # Ordered dict is not marshallable, convert to list
+            self._cfg_tree = CGenYamlCfg.deep_convert_dict(self._cfg_tree)
+        else:
+            # Revert it back
+            self._cfg_tree = CGenYamlCfg.deep_convert_list(self._cfg_tree)
+
+    def generate_yml_file(self, in_file, out_file):
+        cfg_yaml = CFG_YAML()
+        text = cfg_yaml.expand_yaml(in_file)
+        yml_fd = open(out_file, "w")
+        yml_fd.write(text)
+        yml_fd.close()
+        return 0
+
+    def write_cfg_header_file(self, hdr_file_name, tag_mode,
+                              tag_dict, struct_list):
+        lines = []
+        lines.append('\n\n')
+        if self.get_mode() == 'FSP':
+            lines.append('#include <FspUpd.h>\n')
+
+        tag_mode = tag_mode & 0x7F
+        tag_list = sorted(list(tag_dict.items()), key=lambda x: x[1])
+        for tagname, tagval in tag_list:
+            if (tag_mode == 0 and tagval >= 0x100) or \
+               (tag_mode == 1 and tagval < 0x100):
+                continue
+            lines.append('#define %-30s 0x%03X\n' % (
+                'CDATA_%s_TAG' % tagname[:-9], tagval))
+        lines.append('\n\n')
+
+        name_dict = {}
+        new_dict = {}
+        for each in struct_list:
+            if (tag_mode == 0 and each['tag'] >= 0x100) or \
+               (tag_mode == 1 and each['tag'] < 0x100):
+                continue
+            new_dict[each['name']] = (each['alias'], each['count'])
+            if each['alias'] not in name_dict:
+                name_dict[each['alias']] = 1
+                lines.extend(self.create_struct(each['alias'],
+                                                each['node'], new_dict))
+        lines.append('#pragma pack()\n\n')
+
+        self.write_header_file(lines, hdr_file_name)
+
+    def write_header_file(self, txt_body, file_name, type='h'):
+        file_name_def = os.path.basename(file_name).replace('.', '_')
+        file_name_def = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', file_name_def)
+        file_name_def = re.sub('([a-z0-9])([A-Z])', r'\1_\2',
+                               file_name_def).upper()
+
+        lines = []
+        lines.append("%s\n" % get_copyright_header(type))
+        lines.append("#ifndef __%s__\n" % file_name_def)
+        lines.append("#define __%s__\n\n" % file_name_def)
+        if type == 'h':
+            lines.append("#pragma pack(1)\n\n")
+        lines.extend(txt_body)
+        if type == 'h':
+            lines.append("#pragma pack()\n\n")
+        lines.append("#endif\n")
+
+        # Don't rewrite if the contents are the same
+        create = True
+        if os.path.exists(file_name):
+            hdr_file = open(file_name, "r")
+            org_txt = hdr_file.read()
+            hdr_file.close()
+
+            new_txt = ''.join(lines)
+            if org_txt == new_txt:
+                create = False
+
+        if create:
+            hdr_file = open(file_name, "w")
+            hdr_file.write(''.join(lines))
+            hdr_file.close()
+
+    def generate_data_inc_file(self, dat_inc_file_name, bin_file=None):
+        # Put a prefix GUID before CFGDATA so that it can be located later on
+        prefix = b'\xa7\xbd\x7f\x73\x20\x1e\x46\xd6\
+\xbe\x8f\x64\x12\x05\x8d\x0a\xa8'
+        if bin_file:
+            fin = open(bin_file, 'rb')
+            bin_dat = prefix + bytearray(fin.read())
+            fin.close()
+        else:
+            bin_dat = prefix + self.generate_binary_array()
+
+        file_name = os.path.basename(dat_inc_file_name).upper()
+        file_name = file_name.replace('.', '_')
+
+        txt_lines = []
+
+        txt_lines.append("UINT8 mConfigDataBlob[%d] = {\n" % len(bin_dat))
+        count = 0
+        line = [' ']
+        for each in bin_dat:
+            line.append('0x%02X, ' % each)
+            count = count + 1
+            if (count & 0x0F) == 0:
+                line.append('\n')
+                txt_lines.append(''.join(line))
+                line = [' ']
+        if len(line) > 1:
+            txt_lines.append(''.join(line) + '\n')
+
+        txt_lines.append("};\n\n")
+        self.write_header_file(txt_lines, dat_inc_file_name, 'inc')
+
+        return 0
+
+ def get_struct_array_info(self, input):

+ parts = input.split(':')

+ if len(parts) > 1:

+ var = parts[1]

+ input = parts[0]

+ else:

+ var = ''

+ array_str = input.split('[')

+ name = array_str[0]

+ if len(array_str) > 1:

+ num_str = ''.join(c for c in array_str[-1] if c.isdigit())

+ num_str = '1000' if len(num_str) == 0 else num_str

+ array_num = int(num_str)

+ else:

+ array_num = 0

+ return name, array_num, var

+

+ def process_multilines(self, string, max_char_length):

+ multilines = ''

+ string_length = len(string)

+ current_string_start = 0

+ string_offset = 0

+ break_line_dict = []

+ if len(string) <= max_char_length:

+ while (string_offset < string_length):

+ if string_offset >= 1:

+ if string[string_offset - 1] == '\\' and string[

+ string_offset] == 'n':

+ break_line_dict.append(string_offset + 1)

+ string_offset += 1

+ if break_line_dict != []:

+ for each in break_line_dict:

+ multilines += " %s\n" % string[

+ current_string_start:each].lstrip()

+ current_string_start = each

+ if string_length - current_string_start > 0:

+ multilines += " %s\n" % string[

+ current_string_start:].lstrip()

+ else:

+ multilines = " %s\n" % string

+ else:

+ new_line_start = 0

+ new_line_count = 0

+ found_space_char = False

+ while (string_offset < string_length):

+ if string_offset >= 1:

+ if new_line_count >= max_char_length - 1:

+ if string[string_offset] == ' ' and \

+ string_length - string_offset > 10:

+ break_line_dict.append(new_line_start

+ + new_line_count)

+ new_line_start = new_line_start + new_line_count

+ new_line_count = 0

+ found_space_char = True

+ elif string_offset == string_length - 1 and \

+ found_space_char is False:

+ break_line_dict.append(0)

+ if string[string_offset - 1] == '\\' and string[

+ string_offset] == 'n':

+ break_line_dict.append(string_offset + 1)

+ new_line_start = string_offset + 1

+ new_line_count = 0

+ string_offset += 1

+ new_line_count += 1

+ if break_line_dict != []:

+ break_line_dict.sort()

+ for each in break_line_dict:

+ if each > 0:

+ multilines += " %s\n" % string[

+ current_string_start:each].lstrip()

+ current_string_start = each

+ if string_length - current_string_start > 0:

+ multilines += " %s\n" % \

+ string[current_string_start:].lstrip()

+ return multilines
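process_multilines() honors embedded literal `\n` escapes first and then wraps near max_char_length. A much shorter approximation using the stdlib — not the exact algorithm, since break positions differ — is:

```python
import textwrap

def wrap_comment(text, width=80, indent='  '):
    # Approximation only: split on the literal two-character "\n" escape,
    # then let textwrap enforce the width limit on each piece.
    out = []
    for piece in text.split('\\n'):
        for line in textwrap.wrap(piece.strip(), width) or ['']:
            out.append(indent + line + '\n')
    return ''.join(out)
```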

+

+ def create_field(self, item, name, length, offset, struct,

+ bsf_name, help, option, bits_length=None):

+ pos_name = 28

+ name_line = ''

+ # help_line = ''

+ # option_line = ''

+

+ if length == 0 and name == 'dummy':

+ return '\n'

+

+ if bits_length == 0:

+ return '\n'

+

+ is_array = False

+ if length in [1, 2, 4, 8]:

+ type = "UINT%d" % (length * 8)

+ else:

+ is_array = True

+ type = "UINT8"

+

+ if item and item['value'].startswith('{'):

+ type = "UINT8"

+ is_array = True

+

+ if struct != '':

+ struct_base = struct.rstrip('*')

+ name = '*' * (len(struct) - len(struct_base)) + name

+ struct = struct_base

+ type = struct

+ if struct in ['UINT8', 'UINT16', 'UINT32', 'UINT64']:

+ is_array = True

+ unit = int(type[4:]) // 8

+ length = length / unit

+ else:

+ is_array = False

+

+ if is_array:

+ name = name + '[%d]' % length

+

+ if len(type) < pos_name:

+ space1 = pos_name - len(type)

+ else:

+ space1 = 1

+

+ if bsf_name != '':

+ name_line = " %s\n" % bsf_name

+ else:

+ name_line = "N/A\n"

+

+ # if help != '':

+ # help_line = self.process_multilines(help, 80)

+

+ # if option != '':

+ # option_line = self.process_multilines(option, 80)

+

+ if offset is None:

+ offset_str = '????'

+ else:

+ offset_str = '0x%04X' % offset

+

+ if bits_length is None:

+ bits_length = ''

+ else:

+ bits_length = ' : %d' % bits_length

+

+ # return "\n/** %s%s%s**/\n %s%s%s%s;\n" % (name_line, help_line,

+ # option_line, type, ' ' * space1, name, bits_length)

+ return "\n /* Offset %s: %s */\n %s%s%s%s;\n" % (

+ offset_str, name_line.strip(), type, ' ' * space1,

+ name, bits_length)

+

+ def create_struct(self, cname, top, struct_dict):

+ index = 0

+ last = ''

+ lines = []

+ off_base = -1

+

+ if cname in struct_dict:

+ if struct_dict[cname][2]:

+ return []

+ lines.append('\ntypedef struct {\n')

+ for field in top:

+ if field[0] == '$':

+ continue

+

+ index += 1

+

+ t_item = top[field]

+ if 'indx' not in t_item:

+ if CGenYamlCfg.STRUCT not in top[field]:

+ continue

+

+ if struct_dict[field][1] == 0:

+ continue

+

+ append = True

+ struct_info = top[field][CGenYamlCfg.STRUCT]

+

+ if 'struct' in struct_info:

+ struct, array_num, var = self.get_struct_array_info(

+ struct_info['struct'])

+ if array_num > 0:

+ if last == struct:

+ append = False

+ last = struct

+ if var == '':

+ var = field

+

+ field = CGenYamlCfg.format_struct_field_name(

+ var, struct_dict[field][1])

+ else:

+ struct = struct_dict[field][0]

+ field = CGenYamlCfg.format_struct_field_name(

+ field, struct_dict[field][1])

+

+ if append:

+ offset = t_item['$STRUCT']['offset'] // 8

+ if off_base == -1:

+ off_base = offset

+ line = self.create_field(None, field, 0, 0, struct,

+ '', '', '')

+ lines.append(' %s' % line)

+ last = struct

+ continue

+

+ item = self.get_item_by_index(t_item['indx'])

+ if item['cname'] == 'CfgHeader' and index == 1 or \

+ (item['cname'] == 'CondValue' and index == 2):

+ continue

+

+ bit_length = None

+ length = (item['length'] + 7) // 8

+ match = re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)",

+ t_item['length'])

+ if match and match.group(2) == 'b':

+ bit_length = int(match.group(1))

+ if match.group(3) != '':

+ length = CGenYamlCfg.bits_width[match.group(3)] // 8

+ else:

+ length = 4

+ offset = item['offset'] // 8

+ if off_base == -1:

+ off_base = offset

+ struct = item.get('struct', '')

+ name = field

+ prompt = item['name']

+ help = item['help']

+ option = item['option']

+ line = self.create_field(item, name, length, offset, struct,

+ prompt, help, option, bit_length)

+ lines.append(' %s' % line)

+ last = struct

+

+ lines.append('\n} %s;\n\n' % cname)

+

+ return lines

+

+ def write_fsp_sig_header_file(self, hdr_file_name):

+ hdr_fd = open(hdr_file_name, 'w')

+ hdr_fd.write("%s\n" % get_copyright_header('h'))

+ hdr_fd.write("#ifndef __FSPUPD_H__\n"

+ "#define __FSPUPD_H__\n\n"

+ "#include <FspEas.h>\n\n"

+ "#pragma pack(1)\n\n")

+ lines = []

+ for fsp_comp in 'TMS':

+ top = self.locate_cfg_item('FSP%s_UPD' % fsp_comp)

+ if not top:

+ raise Exception('Could not find FSP UPD definition !')

+ bins = self.get_field_value(top)

+ lines.append("#define FSP%s_UPD_SIGNATURE"

+ " 0x%016X /* '%s' */\n\n"

+ % (fsp_comp, bytes_to_value(bins[:8]),

+ bins[:8].decode()))

+ hdr_fd.write(''.join(lines))

+ hdr_fd.write("#pragma pack()\n\n"

+ "#endif\n")

+ hdr_fd.close()

+

+ def create_header_file(self, hdr_file_name, com_hdr_file_name='', path=''):

+

+ def _build_header_struct(name, cfgs, level):

+ if CGenYamlCfg.STRUCT in cfgs:

+ if 'CfgHeader' in cfgs:

+ # collect CFGDATA TAG IDs

+ cfghdr = self.get_item_by_index(cfgs['CfgHeader']['indx'])

+ tag_val = array_str_to_value(cfghdr['value']) >> 20

+ tag_dict[name] = tag_val

+ if level == 1:

+ tag_curr[0] = tag_val

+ struct_dict[name] = (level, tag_curr[0], cfgs)

+ if path == 'FSP_SIG':

+ self.write_fsp_sig_header_file(hdr_file_name)

+ return

+ tag_curr = [0]

+ tag_dict = {}

+ struct_dict = {}

+

+ if path == '':

+ top = None

+ else:

+ top = self.locate_cfg_item(path)

+ if not top:

+ raise Exception("Invalid configuration path '%s' !" % path)

+ _build_header_struct(path, top, 0)

+ self.traverse_cfg_tree(_build_header_struct, top)

+

+ if tag_curr[0] == 0:

+ hdr_mode = 2

+ else:

+ hdr_mode = 1

+

+ if re.match('FSP[TMS]_UPD', path):

+ hdr_mode |= 0x80

+

+ # filter out the items to be built for tags and structures

+ struct_list = []

+ for each in struct_dict:

+ match = False

+ for check in CGenYamlCfg.exclude_struct:

+ if re.match(check, each):

+ match = True

+ if each in tag_dict:

+ if each not in CGenYamlCfg.include_tag:

+ del tag_dict[each]

+ break

+ if not match:

+ struct_list.append({'name': each, 'alias': '', 'count': 0,

+ 'level': struct_dict[each][0],

+ 'tag': struct_dict[each][1],

+ 'node': struct_dict[each][2]})

+

+ # sort by level so that the bottom level struct

+ # will be built first to satisfy dependencies

+ struct_list = sorted(struct_list, key=lambda x: x['level'],

+ reverse=True)

+

+ # Convert XXX_[0-9]+ to XXX as an array hint

+ for each in struct_list:

+ cfgs = each['node']

+ if 'struct' in cfgs['$STRUCT']:

+ each['alias'], array_num, var = self.get_struct_array_info(

+ cfgs['$STRUCT']['struct'])

+ else:

+ match = re.match('(\\w+)(_\\d+)', each['name'])

+ if match:

+ each['alias'] = match.group(1)

+ else:

+ each['alias'] = each['name']

+

+ # count items for array build

+ for idx, each in enumerate(struct_list):

+ if idx > 0:

+ last_struct = struct_list[idx-1]['node']['$STRUCT']

+ curr_struct = each['node']['$STRUCT']

+ if struct_list[idx-1]['alias'] == each['alias'] and \

+ curr_struct['length'] == last_struct['length'] and \

+ curr_struct['offset'] == last_struct['offset'] + \

+ last_struct['length']:

+ for idx2 in range(idx-1, -1, -1):

+ if struct_list[idx2]['count'] > 0:

+ struct_list[idx2]['count'] += 1

+ break

+ continue

+ each['count'] = 1

+

+ # generate common header

+ if com_hdr_file_name:

+ self.write_cfg_header_file(com_hdr_file_name, 0, tag_dict,

+ struct_list)

+

+ # generate platform header

+ self.write_cfg_header_file(hdr_file_name, hdr_mode, tag_dict,

+ struct_list)

+

+ return 0

+

+ def load_yaml(self, cfg_file):

+ cfg_yaml = CFG_YAML()

+ self.initialize()

+ self._cfg_tree = cfg_yaml.load_yaml(cfg_file)

+ self._def_dict = cfg_yaml.def_dict

+ self._yaml_path = os.path.dirname(cfg_file)

+ self.build_cfg_list()

+ self.build_var_dict()

+ self.update_def_value()

+ return 0

+

+

+def usage():

+ print('\n'.join([

+ "GenYamlCfg Version 0.50",

+ "Usage:",

+ " GenYamlCfg GENINC BinFile IncOutFile "

+ " [-D Macros]",

+

+ " GenYamlCfg GENPKL YamlFile PklOutFile "

+ " [-D Macros]",

+ " GenYamlCfg GENBIN YamlFile[;DltFile] BinOutFile "

+ " [-D Macros]",

+ " GenYamlCfg GENDLT YamlFile[;BinFile] DltOutFile "

+ " [-D Macros]",

+ " GenYamlCfg GENYML YamlFile YamlOutFile"

+ " [-D Macros]",

+ " GenYamlCfg GENHDR YamlFile HdrOutFile "

+ " [-D Macros]"

+ ]))

+

+

+def main():

+ # Parse the options and args

+ argc = len(sys.argv)

+ if argc < 4:

+ usage()

+ return 1

+

+ gen_cfg_data = CGenYamlCfg()

+ command = sys.argv[1].upper()

+ out_file = sys.argv[3]

+ if argc >= 5 and gen_cfg_data.parse_macros(sys.argv[4:]) != 0:

+ raise Exception("ERROR: Macro parsing failed !")

+

+ file_list = sys.argv[2].split(';')

+ if len(file_list) >= 2:

+ yml_file = file_list[0]

+ dlt_file = file_list[1]

+ elif len(file_list) == 1:

+ yml_file = file_list[0]

+ dlt_file = ''

+ else:

+ raise Exception("ERROR: Invalid parameter '%s' !" % sys.argv[2])

+ yml_scope = ''

+ if '@' in yml_file:

+ parts = yml_file.split('@')

+ yml_file = parts[0]

+ yml_scope = parts[1]

+

+ if command == "GENDLT" and yml_file.endswith('.dlt'):

+ # It needs to expand an existing DLT file

+ dlt_file = yml_file

+ lines = gen_cfg_data.expand_include_files(dlt_file)

+ write_lines(lines, out_file)

+ return 0

+

+ if command == "GENYML":

+ if not yml_file.lower().endswith('.yaml'):

+ raise Exception('Only YAML file is supported !')

+ gen_cfg_data.generate_yml_file(yml_file, out_file)

+ return 0

+

+ bin_file = ''

+ if (yml_file.lower().endswith('.bin')) and (command == "GENINC"):

+ # It is a binary file

+ bin_file = yml_file

+ yml_file = ''

+

+ if bin_file:

+ gen_cfg_data.generate_data_inc_file(out_file, bin_file)

+ return 0

+

+ cfg_bin_file = ''

+ cfg_bin_file2 = ''

+ if dlt_file:

+ if command == "GENDLT":

+ cfg_bin_file = dlt_file

+ dlt_file = ''

+ if len(file_list) >= 3:

+ cfg_bin_file2 = file_list[2]

+

+ if yml_file.lower().endswith('.pkl'):

+ with open(yml_file, "rb") as pkl_file:

+ gen_cfg_data.__dict__ = marshal.load(pkl_file)

+ gen_cfg_data.prepare_marshal(False)

+

+ # Override macro definition again for Pickle file

+ if argc >= 5:

+ gen_cfg_data.parse_macros(sys.argv[4:])

+ else:

+ gen_cfg_data.load_yaml(yml_file)

+ if command == 'GENPKL':

+ gen_cfg_data.prepare_marshal(True)

+ with open(out_file, "wb") as pkl_file:

+ marshal.dump(gen_cfg_data.__dict__, pkl_file)

+ json_file = os.path.splitext(out_file)[0] + '.json'

+ fo = open(json_file, 'w')

+ path_list = []

+ cfgs = {'_cfg_page': gen_cfg_data._cfg_page,

+ '_cfg_list': gen_cfg_data._cfg_list,

+ '_path_list': path_list}

+ # optimize to reduce size

+ path = None

+ for each in cfgs['_cfg_list']:

+ new_path = each['path'][:-len(each['cname'])-1]

+ if path != new_path:

+ path = new_path

+ each['path'] = path

+ path_list.append(path)

+ else:

+ del each['path']

+ if each['order'] == each['offset']:

+ del each['order']

+ del each['offset']

+

+ # value is just used to indicate display type

+ value = each['value']

+ if value.startswith('0x'):

+ hex_len = ((each['length'] + 7) // 8) * 2

+ if len(value) == hex_len:

+ value = 'x%d' % hex_len

+ else:

+ value = 'x'

+ each['value'] = value

+ elif value and value[0] in ['"', "'", '{']:

+ each['value'] = value[0]

+ else:

+ del each['value']

+

+ fo.write(repr(cfgs))

+ fo.close()

+ return 0

+

+ if dlt_file:

+ gen_cfg_data.override_default_value(dlt_file)

+

+ gen_cfg_data.detect_fsp()

+

+ if command == "GENBIN":

+ if len(file_list) == 3:

+ old_data = gen_cfg_data.generate_binary_array()

+ fi = open(file_list[2], 'rb')

+ new_data = bytearray(fi.read())

+ fi.close()

+ if len(new_data) != len(old_data):

+ raise Exception("Binary file '%s' length does not match, \

+ignored !" % file_list[2])

+ else:

+ gen_cfg_data.load_default_from_bin(new_data)

+ gen_cfg_data.override_default_value(dlt_file)

+

+ gen_cfg_data.generate_binary(out_file, yml_scope)

+

+ elif command == "GENDLT":

+ full = True if 'FULL' in gen_cfg_data._macro_dict else False

+ gen_cfg_data.generate_delta_file(out_file, cfg_bin_file,

+ cfg_bin_file2, full)

+

+ elif command == "GENHDR":

+ out_files = out_file.split(';')

+ brd_out_file = out_files[0].strip()

+ if len(out_files) > 1:

+ com_out_file = out_files[1].strip()

+ else:

+ com_out_file = ''

+ gen_cfg_data.create_header_file(brd_out_file, com_out_file, yml_scope)

+

+ elif command == "GENINC":

+ gen_cfg_data.generate_data_inc_file(out_file)

+

+ elif command == "DEBUG":

+ gen_cfg_data.print_cfgs()

+

+ else:

+ raise Exception("Unsupported command '%s' !" % command)

+

+ return 0

+

+

+if __name__ == '__main__':

+ sys.exit(main())

diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py b/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py
new file mode 100644
index 0000000000..7e008aa68a
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py
@@ -0,0 +1,324 @@
+#!/usr/bin/env python

+# @ SingleSign.py

+# Single signing script

+#

+# Copyright (c) 2020 - 2021, Intel Corporation. All rights reserved.<BR>

+# SPDX-License-Identifier: BSD-2-Clause-Patent

+#

+##

+

+import os

+import sys

+import re

+import shutil

+import subprocess

+

+SIGNING_KEY = {

+ # Key Id | Key File Name start |

+ # =================================================================

+ # KEY_ID_MASTER is used for signing Slimboot Key Hash Manifest \

+ # container (KEYH Component)

+ "KEY_ID_MASTER_RSA2048": "MasterTestKey_Priv_RSA2048.pem",

+ "KEY_ID_MASTER_RSA3072": "MasterTestKey_Priv_RSA3072.pem",

+

+ # KEY_ID_CFGDATA is used for signing the external config data blob

+ "KEY_ID_CFGDATA_RSA2048": "ConfigTestKey_Priv_RSA2048.pem",

+ "KEY_ID_CFGDATA_RSA3072": "ConfigTestKey_Priv_RSA3072.pem",

+

+ # KEY_ID_FIRMWAREUPDATE is used for signing the capsule firmware update image

+ "KEY_ID_FIRMWAREUPDATE_RSA2048": "FirmwareUpdateTestKey_Priv_RSA2048.pem",

+ "KEY_ID_FIRMWAREUPDATE_RSA3072": "FirmwareUpdateTestKey_Priv_RSA3072.pem",

+

+ # KEY_ID_CONTAINER is used for signing container header with mono signature

+ "KEY_ID_CONTAINER_RSA2048": "ContainerTestKey_Priv_RSA2048.pem",

+ "KEY_ID_CONTAINER_RSA3072": "ContainerTestKey_Priv_RSA3072.pem",

+

+ # CONTAINER_COMP1_KEY_ID is used for signing container components

+ "KEY_ID_CONTAINER_COMP_RSA2048": "ContainerCompTestKey_Priv_RSA2048.pem",

+ "KEY_ID_CONTAINER_COMP_RSA3072": "ContainerCompTestKey_Priv_RSA3072.pem",

+

+ # KEY_ID_OS1_PUBLIC, KEY_ID_OS2_PUBLIC is used for referencing \

+ # Boot OS public keys

+ "KEY_ID_OS1_PUBLIC_RSA2048": "OS1_TestKey_Pub_RSA2048.pem",

+ "KEY_ID_OS1_PUBLIC_RSA3072": "OS1_TestKey_Pub_RSA3072.pem",

+

+ "KEY_ID_OS2_PUBLIC_RSA2048": "OS2_TestKey_Pub_RSA2048.pem",

+ "KEY_ID_OS2_PUBLIC_RSA3072": "OS2_TestKey_Pub_RSA3072.pem",

+

+ }

+

+MESSAGE_SBL_KEY_DIR = """!!! PRE-REQUISITE: Path to SBL_KEY_DIR has

+to be set with the SBL KEYS DIRECTORY !!! \n!!! Generate keys

+using GenerateKeys.py available in the BootloaderCorePkg/Tools

+directory !!! \n!!! Run $python

+BootloaderCorePkg/Tools/GenerateKeys.py -k $PATH_TO_SBL_KEY_DIR !!!\n

+!!! Set the SBL_KEY_DIR environment variable with the path to the SBL KEYS DIR !!!\n

+!!! Windows $set SBL_KEY_DIR=$PATH_TO_SBL_KEY_DIR !!!\n

+!!! Linux $export SBL_KEY_DIR=$PATH_TO_SBL_KEY_DIR !!!\n"""

+

+

+def get_openssl_path():

+ if os.name == 'nt':

+ if 'OPENSSL_PATH' not in os.environ:

+ openssl_dir = "C:\\Openssl\\bin\\"

+ if os.path.exists(openssl_dir):

+ os.environ['OPENSSL_PATH'] = openssl_dir

+ else:

+ os.environ['OPENSSL_PATH'] = "C:\\Openssl\\"

+ if 'OPENSSL_CONF' not in os.environ:

+ openssl_cfg = "C:\\Openssl\\openssl.cfg"

+ if os.path.exists(openssl_cfg):

+ os.environ['OPENSSL_CONF'] = openssl_cfg

+ openssl = os.path.join(

+ os.environ.get('OPENSSL_PATH', ''),

+ 'openssl.exe')

+ else:

+ # Get openssl path for Linux cases

+ openssl = shutil.which('openssl')

+

+ return openssl

+

+

+def run_process(arg_list, print_cmd=False, capture_out=False):

+ sys.stdout.flush()

+ if print_cmd:

+ print(' '.join(arg_list))

+

+ exc = None

+ result = 0

+ output = ''

+ try:

+ if capture_out:

+ output = subprocess.check_output(arg_list).decode()

+ else:

+ result = subprocess.call(arg_list)

+ except Exception as ex:

+ result = 1

+ exc = ex

+

+ if result:

+ if not print_cmd:

+ print('Error in running process:\n %s' % ' '.join(arg_list))

+ if exc is None:

+ sys.exit(1)

+ else:

+ raise exc

+

+ return output

+

+

+def check_file_pem_format(priv_key):

+ # Check for file .pem format

+ key_name = os.path.basename(priv_key)

+ if os.path.splitext(key_name)[1] == ".pem":

+ return True

+ else:

+ return False

+

+

+def get_key_id(priv_key):

+ # Extract base name if path is provided.

+ key_name = os.path.basename(priv_key)

+ # Check for KEY_ID in key naming.

+ if key_name.startswith('KEY_ID'):

+ return key_name

+ else:

+ return None

+

+

+def get_sbl_key_dir():

+ # Check Key store setting SBL_KEY_DIR path

+ if 'SBL_KEY_DIR' not in os.environ:

+ exception_string = "ERROR: SBL_KEY_DIR is not defined." \

+ " Set SBL_KEY_DIR with SBL Keys directory!!\n"

+ raise Exception(exception_string + MESSAGE_SBL_KEY_DIR)

+

+ sbl_key_dir = os.environ.get('SBL_KEY_DIR')

+ if not os.path.exists(sbl_key_dir):

+ exception_string = "ERROR:SBL_KEY_DIR set " + sbl_key_dir \

+ + " is not valid." \

+ " Set the correct SBL_KEY_DIR path !!\n" \

+ + MESSAGE_SBL_KEY_DIR

+ raise Exception(exception_string)

+ else:

+ return sbl_key_dir

+

+

+def get_key_from_store(in_key):

+

+ # Check in_key is path to key

+ if os.path.exists(in_key):

+ return in_key

+

+ # Get Slimboot key dir path

+ sbl_key_dir = get_sbl_key_dir()

+

+ # Extract if in_key is key_id

+ priv_key = get_key_id(in_key)

+ if priv_key is not None:

+ if (priv_key in SIGNING_KEY):

+ # Generate key file name from key id

+ priv_key_file = SIGNING_KEY[priv_key]

+ else:

+ exception_string = "KEY_ID " + priv_key + " is not" \

+ " found in supported KEY IDs!!"

+ raise Exception(exception_string)

+ elif check_file_pem_format(in_key):

+ # check if file name is provided in pem format

+ priv_key_file = in_key

+ else:

+ priv_key_file = None

+ raise Exception('key provided %s is not valid!' % in_key)

+

+ # Create a file path

+ # Join Key Dir and priv_key_file

+ try:

+ priv_key = os.path.join(sbl_key_dir, priv_key_file)

+ except Exception:

+ raise Exception('priv_key is not found %s!' % priv_key)

+

+ # Check that the priv_key constructed from the KEY ID exists at the specified path

+ if not os.path.isfile(priv_key):

+ exception_string = "!!! ERROR: Key file corresponding to " \

+ + in_key + " does not exist in the SBL key " \

+ "directory at " + sbl_key_dir + " !!! \n" \

+ + MESSAGE_SBL_KEY_DIR

+ raise Exception(exception_string)

+

+ return priv_key

+

+#

+# Sign a file using openssl

+#

+# priv_key [Input] Key Id or Path to Private key

+# hash_type [Input] Signing hash

+# sign_scheme[Input] Sign/padding scheme

+# in_file [Input] Input file to be signed

+# out_file [Input/Output] Signed data file

+#

+

+

+def single_sign_file(priv_key, hash_type, sign_scheme, in_file, out_file):

+

+ _hash_type_string = {

+ "SHA2_256": 'sha256',

+ "SHA2_384": 'sha384',

+ "SHA2_512": 'sha512',

+ }

+

+ _hash_digest_Size = {

+ # Hash_string : Hash_Size

+ "SHA2_256": 32,

+ "SHA2_384": 48,

+ "SHA2_512": 64,

+ "SM3_256": 32,

+ }

+

+ _sign_scheme_string = {

+ "RSA_PKCS1": 'pkcs1',

+ "RSA_PSS": 'pss',

+ }

+

+ priv_key = get_key_from_store(priv_key)

+

+ # Temporary files to store hash generated

+ hash_file_tmp = out_file+'.hash.tmp'

+ hash_file = out_file+'.hash'

+

+ # Generate hash using openssl dgst in hex format

+ cmdargs = [get_openssl_path(),

+ 'dgst',

+ '-'+'%s' % _hash_type_string[hash_type],

+ '-out', '%s' % hash_file_tmp, '%s' % in_file]

+ run_process(cmdargs)

+

+ # Extract hash from dgst command output and convert from hex to binary

+ with open(hash_file_tmp, 'r') as fin:

+ hashdata = fin.read()

+ fin.close()

+

+ try:

+ hashdata = hashdata.rsplit('=', 1)[1].strip()

+ except Exception:

+ raise Exception('Hash Data not found for signing!')

+

+ if len(hashdata) != (_hash_digest_Size[hash_type] * 2):

+ raise Exception('Hash data size does not match the hash type!')

+

+ hashdata_bytes = bytearray.fromhex(hashdata)

+ open(hash_file, 'wb').write(hashdata_bytes)

+

+ print("Key used for signing: %s !!" % priv_key)

+

+ # sign using Openssl pkeyutl

+ cmdargs = [get_openssl_path(),

+ 'pkeyutl', '-sign', '-in', '%s' % hash_file,

+ '-inkey', '%s' % priv_key, '-out',

+ '%s' % out_file, '-pkeyopt',

+ 'digest:%s' % _hash_type_string[hash_type],

+ '-pkeyopt', 'rsa_padding_mode:%s' %

+ _sign_scheme_string[sign_scheme]]

+

+ run_process(cmdargs)

+

+ return
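single_sign_file() above shells out to openssl twice: `dgst` to hash the input, then `pkeyutl` to sign the raw digest. A sketch that only builds the two argument vectors (helper name is illustrative; the real function also converts the hex `dgst` output to binary in between):

```python
def build_sign_cmds(openssl, hash_name, padding, in_file, key, out_file):
    """Return the (dgst, pkeyutl) argument lists for two-step signing."""
    hash_tmp = out_file + '.hash'
    # Step 1: hash the input file
    dgst = [openssl, 'dgst', '-' + hash_name, '-out', hash_tmp, in_file]
    # Step 2: sign the digest with the chosen RSA padding mode
    sign = [openssl, 'pkeyutl', '-sign', '-in', hash_tmp,
            '-inkey', key, '-out', out_file,
            '-pkeyopt', 'digest:' + hash_name,
            '-pkeyopt', 'rsa_padding_mode:' + padding]
    return dgst, sign
```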

+

+#

+# Extract public key using openssl

+#

+# in_key [Input] Private key or public key in pem format

+# pub_key_file [Input/Output] Public Key to a file

+#

+# return keydata (mod, exp) in bin format

+#

+

+

+def single_sign_gen_pub_key(in_key, pub_key_file=None):

+

+ in_key = get_key_from_store(in_key)

+

+ # Expect key to be in PEM format

+ is_prv_key = False

+ cmdline = [get_openssl_path(), 'rsa', '-pubout', '-text', '-noout',

+ '-in', '%s' % in_key]

+ # Check if it is public key or private key

+ text = open(in_key, 'r').read()

+ if '-BEGIN RSA PRIVATE KEY-' in text:

+ is_prv_key = True

+ elif '-BEGIN PUBLIC KEY-' in text:

+ cmdline.extend(['-pubin'])

+ else:

+ raise Exception('Unknown key format "%s" !' % in_key)

+

+ if pub_key_file:

+ cmdline.extend(['-out', '%s' % pub_key_file])

+ capture = False

+ else:

+ capture = True

+

+ output = run_process(cmdline, capture_out=capture)

+ if not capture:

+ output = text = open(pub_key_file, 'r').read()

+ data = output.replace('\r', '')

+ data = data.replace('\n', '')

+ data = data.replace(' ', '')

+

+ # Extract the modulus

+ if is_prv_key:

+ match = re.search('modulus(.*)publicExponent:\\s+(\\d+)\\s+', data)

+ else:

+ match = re.search('Modulus(?:.*?):(.*)Exponent:\\s+(\\d+)\\s+', data)

+ if not match:

+ raise Exception('Public key not found!')

+ modulus = match.group(1).replace(':', '')

+ exponent = int(match.group(2))

+

+ mod = bytearray.fromhex(modulus)

+ # Remove the '00' from the front if the MSB is 1

+ if mod[0] == 0 and (mod[1] & 0x80):

+ mod = mod[1:]

+ exp = bytearray.fromhex('{:08x}'.format(exponent))

+

+ keydata = mod + exp

+

+ return keydata
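The mod/exp packing at the end of single_sign_gen_pub_key() can be isolated as a sketch: drop the leading 0x00 byte that openssl prints when the modulus MSB is set, then append the exponent as four big-endian bytes (helper name is illustrative):

```python
def pack_rsa_keydata(modulus_hex, exponent):
    mod = bytearray.fromhex(modulus_hex)
    # openssl prefixes a 00 byte when the modulus MSB is 1; strip it
    if mod[0] == 0 and (mod[1] & 0x80):
        mod = mod[1:]
    # Exponent as 4 big-endian bytes
    exp = bytearray.fromhex('{:08x}'.format(exponent))
    return mod + exp
```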

diff --git a/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py b/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py
index d2ca7145ae..c64b50404d 100644
--- a/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py
+++ b/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py
@@ -1,8 +1,7 @@
#!/usr/bin/env python

-## @ FspDscBsf2Yaml.py

-# This script convert DSC or BSF format file into YAML format

-#

-# Copyright(c) 2021, Intel Corporation. All rights reserved.<BR>

+# @ FspBsf2Dsc.py

+# This script converts FSP BSF format into DSC format

+# Copyright (c) 2020 - 2021, Intel Corporation. All rights reserved.<BR>

# SPDX-License-Identifier: BSD-2-Clause-Patent

#

##

@@ -10,277 +9,38 @@
import os

import re

import sys

-from datetime import date

+

from collections import OrderedDict

-from functools import reduce

+from datetime import date



-from GenCfgOpt import CGenCfgOpt

+from FspGenCfgData import CFspBsf2Dsc, CGenCfgData



__copyright_tmp__ = """## @file

#

-# YAML CFGDATA %s File.

-#

-# Copyright(c) %4d, Intel Corporation. All rights reserved.<BR>

-# SPDX-License-Identifier: BSD-2-Clause-Patent

-#

-##

-"""

-

-__copyright_dsc__ = """## @file

+# Slim Bootloader CFGDATA %s File.

#

-# Copyright (c) %04d, Intel Corporation. All rights reserved.<BR>

+# Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>

# SPDX-License-Identifier: BSD-2-Clause-Patent

#

##

-

-[PcdsDynamicVpd.Upd]

- #

- # Global definitions in BSF

- # !BSF BLOCK:{NAME:"FSP UPD Configuration", VER:"0.1"}

- #

-

"""





-def Bytes2Val(Bytes):

- return reduce(lambda x, y: (x << 8) | y, Bytes[::-1])

-

-

-def Str2Bytes(Value, Blen):

- Result = bytearray(Value[1:-1], 'utf-8') # Excluding quotes

- if len(Result) < Blen:

- Result.extend(b'\x00' * (Blen - len(Result)))

- return Result

-

-

-class CFspBsf2Dsc:

-

- def __init__(self, bsf_file):

- self.cfg_list = CFspBsf2Dsc.parse_bsf(bsf_file)

-

- def get_dsc_lines(self):

- return CFspBsf2Dsc.generate_dsc(self.cfg_list)

-

- def save_dsc(self, dsc_file):

- return CFspBsf2Dsc.generate_dsc(self.cfg_list, dsc_file)

-

- @staticmethod

- def parse_bsf(bsf_file):

-

- fd = open(bsf_file, 'r')

- bsf_txt = fd.read()

- fd.close()

-

- find_list = []

- regex = re.compile(r'\s+Find\s+"(.*?)"(.*?)^\s+\$(.*?)\s+', re.S | re.MULTILINE)

- for match in regex.finditer(bsf_txt):

- find = match.group(1)

- name = match.group(3)

- if not name.endswith('_Revision'):

- raise Exception("Unexpected CFG item following 'Find' !")

- find_list.append((name, find))

-

- idx = 0

- count = 0

- prefix = ''

- chk_dict = {}

- cfg_list = []

- cfg_temp = {'find': '', 'cname': '', 'length': 0, 'value': '0', 'type': 'Reserved',

- 'embed': '', 'page': '', 'option': '', 'instance': 0}

- regex = re.compile(r'^\s+(\$(.*?)|Skip)\s+(\d+)\s+bytes(\s+\$_DEFAULT_\s+=\s+(.+?))?$',

- re.S | re.MULTILINE)

-

- for match in regex.finditer(bsf_txt):

- dlen = int(match.group(3))

- if match.group(1) == 'Skip':

- key = 'gPlatformFspPkgTokenSpaceGuid_BsfSkip%d' % idx

- val = ', '.join(['%02X' % ord(i) for i in '\x00' * dlen])

- idx += 1

- option = '$SKIP'

- else:

- key = match.group(2)

- val = match.group(5)

- option = ''

-

- cfg_item = dict(cfg_temp)

- finds = [i for i in find_list if i[0] == key]

- if len(finds) > 0:

- if count >= 1:

- # Append a dummy one

- cfg_item['cname'] = 'Dummy'

- cfg_list.append(dict(cfg_item))

- cfg_list[-1]['embed'] = '%s:TAG_%03X:END' % (prefix, ord(prefix[-1]))

- prefix = finds[0][1]

- cfg_item['embed'] = '%s:TAG_%03X:START' % (prefix, ord(prefix[-1]))

- cfg_item['find'] = prefix

- cfg_item['cname'] = 'Signature'

- cfg_item['length'] = len(finds[0][1])

- str2byte = Str2Bytes("'" + finds[0][1] + "'", len(finds[0][1]))

- cfg_item['value'] = '0x%X' % Bytes2Val(str2byte)

- cfg_list.append(dict(cfg_item))

- cfg_item = dict(cfg_temp)

- find_list.pop(0)

- count = 0

-

- cfg_item['cname'] = key

- cfg_item['length'] = dlen

- cfg_item['value'] = val

- cfg_item['option'] = option

-

- if key not in chk_dict.keys():

- chk_dict[key] = 0

- else:

- chk_dict[key] += 1

- cfg_item['instance'] = chk_dict[key]

-

- cfg_list.append(cfg_item)

- count += 1

-

- if prefix:

- cfg_item = dict(cfg_temp)

- cfg_item['cname'] = 'Dummy'

- cfg_item['embed'] = '%s:%03X:END' % (prefix, ord(prefix[-1]))

- cfg_list.append(cfg_item)

-

- option_dict = {}

- selreg = re.compile(r'\s+Selection\s*(.+?)\s*,\s*"(.*?)"$', re.S | re.MULTILINE)

- regex = re.compile(r'^List\s&(.+?)$(.+?)^EndList$', re.S | re.MULTILINE)

- for match in regex.finditer(bsf_txt):

- key = match.group(1)

- option_dict[key] = []

- for select in selreg.finditer(match.group(2)):

- option_dict[key].append((int(select.group(1), 0), select.group(2)))

-

- chk_dict = {}

- pagereg = re.compile(r'^Page\s"(.*?)"$(.+?)^EndPage$', re.S | re.MULTILINE)

- for match in pagereg.finditer(bsf_txt):

- page = match.group(1)

- for line in match.group(2).splitlines():

- match = re.match(r'\s+(Combo|EditNum)\s\$(.+?),\s"(.*?)",\s(.+?),$', line)

- if match:

- cname = match.group(2)

- if cname not in chk_dict.keys():

- chk_dict[cname] = 0

- else:

- chk_dict[cname] += 1

- instance = chk_dict[cname]

- cfg_idxs = [i for i, j in enumerate(cfg_list) if j['cname'] == cname and j['instance'] == instance]

- if len(cfg_idxs) != 1:

- raise Exception("Multiple CFG item '%s' found !" % cname)

- cfg_item = cfg_list[cfg_idxs[0]]

- cfg_item['page'] = page

- cfg_item['type'] = match.group(1)

- cfg_item['prompt'] = match.group(3)

- cfg_item['range'] = None

- if cfg_item['type'] == 'Combo':

- cfg_item['option'] = option_dict[match.group(4)[1:]]

- elif cfg_item['type'] == 'EditNum':

- cfg_item['option'] = match.group(4)

- match = re.match(r'\s+ Help\s"(.*?)"$', line)

- if match:

- cfg_item['help'] = match.group(1)

-

- match = re.match(r'\s+"Valid\srange:\s(.*)"$', line)

- if match:

- parts = match.group(1).split()

- cfg_item['option'] = (

- (int(parts[0], 0), int(parts[2], 0), cfg_item['option']))

-

- return cfg_list

-

- @staticmethod

- def generate_dsc(option_list, dsc_file=None):

- dsc_lines = []

- header = '%s' % (__copyright_dsc__ % date.today().year)

- dsc_lines.extend(header.splitlines())

-

- pages = []

- for cfg_item in option_list:

- if cfg_item['page'] and (cfg_item['page'] not in pages):

- pages.append(cfg_item['page'])

-

- page_id = 0

- for page in pages:

- dsc_lines.append(' # !BSF PAGES:{PG%02X::"%s"}' % (page_id, page))

- page_id += 1

- dsc_lines.append('')

-

- last_page = ''

- for option in option_list:

- dsc_lines.append('')

- default = option['value']

- pos = option['cname'].find('_')

- name = option['cname'][pos + 1:]

-

- if option['find']:

- dsc_lines.append(' # !BSF FIND:{%s}' % option['find'])

- dsc_lines.append('')

-

- if option['instance'] > 0:

- name = name + '_%s' % option['instance']

-

- if option['embed']:

- dsc_lines.append(' # !HDR EMBED:{%s}' % option['embed'])

-

- if option['type'] == 'Reserved':

- dsc_lines.append(' # !BSF NAME:{Reserved} TYPE:{Reserved}')

- if option['option'] == '$SKIP':

- dsc_lines.append(' # !BSF OPTION:{$SKIP}')

- else:

- prompt = option['prompt']

-

- if last_page != option['page']:

- last_page = option['page']

- dsc_lines.append(' # !BSF PAGE:{PG%02X}' % (pages.index(option['page'])))

-

- if option['type'] == 'Combo':

- dsc_lines.append(' # !BSF NAME:{%s} TYPE:{%s}' %

- (prompt, option['type']))

- ops = []

- for val, text in option['option']:

- ops.append('0x%x:%s' % (val, text))

- dsc_lines.append(' # !BSF OPTION:{%s}' % (', '.join(ops)))

- elif option['type'] == 'EditNum':

- cfg_len = option['length']

- if ',' in default and cfg_len > 8:

- dsc_lines.append(' # !BSF NAME:{%s} TYPE:{Table}' % (prompt))

- if cfg_len > 16:

- cfg_len = 16

- ops = []

- for i in range(cfg_len):

- ops.append('%X:1:HEX' % i)

- dsc_lines.append(' # !BSF OPTION:{%s}' % (', '.join(ops)))

- else:

- dsc_lines.append(

- ' # !BSF NAME:{%s} TYPE:{%s, %s,(0x%X, 0x%X)}' %

- (prompt, option['type'], option['option'][2],

- option['option'][0], option['option'][1]))

- dsc_lines.append(' # !BSF HELP:{%s}' % option['help'])

-

- if ',' in default:

- default = '{%s}' % default

- dsc_lines.append(' gCfgData.%-30s | * | 0x%04X | %s' %

- (name, option['length'], default))

-

- if dsc_file:

- fd = open(dsc_file, 'w')

- fd.write('\n'.join(dsc_lines))

- fd.close()

-

- return dsc_lines

-

-

class CFspDsc2Yaml():



def __init__(self):

self._Hdr_key_list = ['EMBED', 'STRUCT']

- self._Bsf_key_list = ['NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES', 'OPTION',

- 'CONDITION', 'ORDER', 'MARKER', 'SUBT', 'FIELD', 'FIND']

+ self._Bsf_key_list = ['NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES',

+ 'OPTION', 'CONDITION', 'ORDER', 'MARKER',

+ 'SUBT', 'FIELD', 'FIND']

self.gen_cfg_data = None

- self.cfg_reg_exp = re.compile(r"^([_a-zA-Z0-9$\(\)]+)\s*\|\s*(0x[0-9A-F]+|\*)\s*\|"

- + r"\s*(\d+|0x[0-9a-fA-F]+)\s*\|\s*(.+)")

- self.bsf_reg_exp = re.compile(r"(%s):{(.+?)}(?:$|\s+)" % '|'.join(self._Bsf_key_list))

- self.hdr_reg_exp = re.compile(r"(%s):{(.+?)}" % '|'.join(self._Hdr_key_list))

+ self.cfg_reg_exp = re.compile(

+ "^([_a-zA-Z0-9$\\(\\)]+)\\s*\\|\\s*(0x[0-9A-F]+|\\*)"

+ "\\s*\\|\\s*(\\d+|0x[0-9a-fA-F]+)\\s*\\|\\s*(.+)")

+ self.bsf_reg_exp = re.compile("(%s):{(.+?)}(?:$|\\s+)"

+ % '|'.join(self._Bsf_key_list))

+ self.hdr_reg_exp = re.compile("(%s):{(.+?)}"

+ % '|'.join(self._Hdr_key_list))

self.prefix = ''

self.unused_idx = 0

self.offset = 0

@@ -290,15 +50,15 @@ class CFspDsc2Yaml():
"""

Load and parse a DSC CFGDATA file.

"""

- gen_cfg_data = CGenCfgOpt('FSP')

+ gen_cfg_data = CGenCfgData('FSP')

if file_name.endswith('.dsc'):

- # if gen_cfg_data.ParseDscFileYaml(file_name, '') != 0:

- if gen_cfg_data.ParseDscFile(file_name, '') != 0:

+ if gen_cfg_data.ParseDscFile(file_name) != 0:

raise Exception('DSC file parsing error !')

if gen_cfg_data.CreateVarDict() != 0:

raise Exception('DSC variable creation error !')

else:

raise Exception('Unsupported file "%s" !' % file_name)

+ gen_cfg_data.UpdateDefaultValue()

self.gen_cfg_data = gen_cfg_data



def print_dsc_line(self):

@@ -312,14 +72,15 @@ class CFspDsc2Yaml():
"""

Format a CFGDATA item into YAML format.

"""

- if(not text.startswith('!expand')) and (': ' in text):

+ if (not text.startswith('!expand')) and (': ' in text):

tgt = ':' if field == 'option' else '- '

text = text.replace(': ', tgt)

lines = text.splitlines()

if len(lines) == 1 and field != 'help':

return text

else:

- return '>\n ' + '\n '.join([indent + i.lstrip() for i in lines])

+ return '>\n ' + '\n '.join(

+ [indent + i.lstrip() for i in lines])



def reformat_pages(self, val):

# Convert XXX:YYY into XXX::YYY format for page definition

@@ -355,14 +116,16 @@ class CFspDsc2Yaml():
cfg['page'] = self.reformat_pages(cfg['page'])



if 'struct' in cfg:

- cfg['value'] = self.reformat_struct_value(cfg['struct'], cfg['value'])

+ cfg['value'] = self.reformat_struct_value(

+ cfg['struct'], cfg['value'])



def parse_dsc_line(self, dsc_line, config_dict, init_dict, include):

"""

Parse a line in DSC and update the config dictionary accordingly.

"""

init_dict.clear()

- match = re.match(r'g(CfgData|\w+FspPkgTokenSpaceGuid)\.(.+)', dsc_line)

+ match = re.match('g(CfgData|\\w+FspPkgTokenSpaceGuid)\\.(.+)',

+ dsc_line)

if match:

match = self.cfg_reg_exp.match(match.group(2))

if not match:

@@ -385,7 +148,7 @@ class CFspDsc2Yaml():
self.offset = offset + int(length, 0)

return True



- match = re.match(r"^\s*#\s+!([<>])\s+include\s+(.+)", dsc_line)

+ match = re.match("^\\s*#\\s+!([<>])\\s+include\\s+(.+)", dsc_line)

if match and len(config_dict) == 0:

# !include should not be inside a config field

# if so, do not convert include into YAML

@@ -398,7 +161,7 @@ class CFspDsc2Yaml():
config_dict['include'] = ''

return True



- match = re.match(r"^\s*#\s+(!BSF|!HDR)\s+(.+)", dsc_line)

+ match = re.match("^\\s*#\\s+(!BSF|!HDR)\\s+(.+)", dsc_line)

if not match:

return False



@@ -434,16 +197,19 @@ class CFspDsc2Yaml():
tmp_name = parts[0][:-5]

if tmp_name == 'CFGHDR':

cfg_tag = '_$FFF_'

- sval = '!expand { %s_TMPL : [ ' % tmp_name + '%s, %s, ' % (parts[1], cfg_tag) \

- + ', '.join(parts[2:]) + ' ] }'

+ sval = '!expand { %s_TMPL : [ ' % \

+ tmp_name + '%s, %s, ' % (parts[1], cfg_tag) + \

+ ', '.join(parts[2:]) + ' ] }'

else:

- sval = '!expand { %s_TMPL : [ ' % tmp_name + ', '.join(parts[1:]) + ' ] }'

+ sval = '!expand { %s_TMPL : [ ' % \

+ tmp_name + ', '.join(parts[1:]) + ' ] }'

config_dict.clear()

config_dict['cname'] = tmp_name

config_dict['expand'] = sval

return True

else:

- if key in ['name', 'help', 'option'] and val.startswith('+'):

+ if key in ['name', 'help', 'option'] and \

+ val.startswith('+'):

val = config_dict[key] + '\n' + val[1:]

if val.strip() == '':

val = "''"

@@ -493,21 +259,23 @@ class CFspDsc2Yaml():
include_file = ['.']



for line in lines:

- match = re.match(r"^\s*#\s+!([<>])\s+include\s+(.+)", line)

+ match = re.match("^\\s*#\\s+!([<>])\\s+include\\s+(.+)", line)

if match:

if match.group(1) == '<':

include_file.append(match.group(2))

else:

include_file.pop()



- match = re.match(r"^\s*#\s+(!BSF)\s+DEFT:{(.+?):(START|END)}", line)

+ match = re.match(

+ "^\\s*#\\s+(!BSF)\\s+DEFT:{(.+?):(START|END)}", line)

if match:

if match.group(3) == 'START' and not template_name:

template_name = match.group(2).strip()

temp_file_dict[template_name] = list(include_file)

bsf_temp_dict[template_name] = []

- if match.group(3) == 'END' and (template_name == match.group(2).strip()) \

- and template_name:

+ if match.group(3) == 'END' and \

+ (template_name == match.group(2).strip()) and \

+ template_name:

template_name = ''

else:

if template_name:

@@ -531,12 +299,14 @@ class CFspDsc2Yaml():
init_dict.clear()

padding_dict = {}

cfgs.append(padding_dict)

- padding_dict['cname'] = 'UnusedUpdSpace%d' % self.unused_idx

+ padding_dict['cname'] = 'UnusedUpdSpace%d' % \

+ self.unused_idx

padding_dict['length'] = '0x%x' % num

padding_dict['value'] = '{ 0 }'

self.unused_idx += 1



- if cfgs and cfgs[-1]['cname'][0] != '@' and config_dict['cname'][0] == '@':

+ if cfgs and cfgs[-1]['cname'][0] != '@' and \

+ config_dict['cname'][0] == '@':

# it is a bit field, mark the previous one as virtual

cname = cfgs[-1]['cname']

new_cfg = dict(cfgs[-1])

@@ -545,7 +315,8 @@ class CFspDsc2Yaml():
cfgs[-1]['cname'] = cname

cfgs.append(new_cfg)



- if cfgs and cfgs[-1]['cname'] == 'CFGHDR' and config_dict['cname'][0] == '<':

+ if cfgs and cfgs[-1]['cname'] == 'CFGHDR' and \

+ config_dict['cname'][0] == '<':

# swap CfgHeader and the CFG_DATA order

if ':' in config_dict['cname']:

# replace the real TAG for CFG_DATA

@@ -661,7 +432,7 @@ class CFspDsc2Yaml():
lines = []

for each in self.gen_cfg_data._MacroDict:

key, value = self.variable_fixup(each)

- lines.append('%-30s : %s' % (key, value))

+ lines.append('%-30s : %s' % (key, value))

return lines



def output_template(self):

@@ -671,7 +442,8 @@ class CFspDsc2Yaml():
self.offset = 0

self.base_offset = 0

start, end = self.get_section_range('PcdsDynamicVpd.Tmp')

- bsf_temp_dict, temp_file_dict = self.process_template_lines(self.gen_cfg_data._DscLines[start:end])

+ bsf_temp_dict, temp_file_dict = self.process_template_lines(

+ self.gen_cfg_data._DscLines[start:end])

template_dict = dict()

lines = []

file_lines = {}

@@ -679,15 +451,18 @@ class CFspDsc2Yaml():
file_lines[last_file] = []



for tmp_name in temp_file_dict:

- temp_file_dict[tmp_name][-1] = self.normalize_file_name(temp_file_dict[tmp_name][-1], True)

+ temp_file_dict[tmp_name][-1] = self.normalize_file_name(

+ temp_file_dict[tmp_name][-1], True)

if len(temp_file_dict[tmp_name]) > 1:

- temp_file_dict[tmp_name][-2] = self.normalize_file_name(temp_file_dict[tmp_name][-2], True)

+ temp_file_dict[tmp_name][-2] = self.normalize_file_name(

+ temp_file_dict[tmp_name][-2], True)



for tmp_name in bsf_temp_dict:

file = temp_file_dict[tmp_name][-1]

if last_file != file and len(temp_file_dict[tmp_name]) > 1:

inc_file = temp_file_dict[tmp_name][-2]

- file_lines[inc_file].extend(['', '- !include %s' % temp_file_dict[tmp_name][-1], ''])

+ file_lines[inc_file].extend(

+ ['', '- !include %s' % temp_file_dict[tmp_name][-1], ''])

last_file = file

if file not in file_lines:

file_lines[file] = []

@@ -708,7 +483,8 @@ class CFspDsc2Yaml():
self.offset = 0

self.base_offset = 0

start, end = self.get_section_range('PcdsDynamicVpd.Upd')

- cfgs = self.process_option_lines(self.gen_cfg_data._DscLines[start:end])

+ cfgs = self.process_option_lines(

+ self.gen_cfg_data._DscLines[start:end])

self.config_fixup(cfgs)

file_lines = self.output_dict(cfgs, True)

return file_lines

@@ -721,13 +497,17 @@ class CFspDsc2Yaml():
level = 0

file = '.'

for each in cfgs:

- if 'length' in each and int(each['length'], 0) == 0:

- continue

+ if 'length' in each:

+ if not each['length'].endswith('b') and int(each['length'],

+ 0) == 0:

+ continue



if 'include' in each:

if each['include']:

- each['include'] = self.normalize_file_name(each['include'])

- file_lines[file].extend(['', '- !include %s' % each['include'], ''])

+ each['include'] = self.normalize_file_name(

+ each['include'])

+ file_lines[file].extend(

+ ['', '- !include %s' % each['include'], ''])

file = each['include']

else:

file = '.'

@@ -766,7 +546,8 @@ class CFspDsc2Yaml():
for field in each:

if field in ['cname', 'expand', 'include']:

continue

- value_str = self.format_value(field, each[field], padding + ' ' * 16)

+ value_str = self.format_value(

+ field, each[field], padding + ' ' * 16)

full_line = ' %s %-12s : %s' % (padding, field, value_str)

lines.extend(full_line.splitlines())



@@ -802,11 +583,13 @@ def dsc_to_yaml(dsc_file, yaml_file):
if file == '.':

cfgs[cfg] = lines

else:

- if('/' in file or '\\' in file):

+ if ('/' in file or '\\' in file):

continue

file = os.path.basename(file)

- fo = open(os.path.join(file), 'w')

- fo.write(__copyright_tmp__ % (cfg, date.today().year) + '\n\n')

+ out_dir = os.path.dirname(file)

+ fo = open(os.path.join(out_dir, file), 'w')

+ fo.write(__copyright_tmp__ % (

+ cfg, date.today().year) + '\n\n')

for line in lines:

fo.write(line + '\n')

fo.close()

@@ -821,13 +604,11 @@ def dsc_to_yaml(dsc_file, yaml_file):


fo.write('\n\ntemplate:\n')

for line in cfgs['Template']:

- if line != '':

- fo.write(' ' + line + '\n')

+ fo.write(' ' + line + '\n')



fo.write('\n\nconfigs:\n')

for line in cfgs['Option']:

- if line != '':

- fo.write(' ' + line + '\n')

+ fo.write(' ' + line + '\n')



fo.close()



@@ -864,7 +645,8 @@ def main():
bsf_file = sys.argv[1]

yaml_file = sys.argv[2]

if os.path.isdir(yaml_file):

- yaml_file = os.path.join(yaml_file, get_fsp_name_from_path(bsf_file) + '.yaml')

+ yaml_file = os.path.join(

+ yaml_file, get_fsp_name_from_path(bsf_file) + '.yaml')



if bsf_file.endswith('.dsc'):

dsc_file = bsf_file

diff --git a/IntelFsp2Pkg/Tools/FspGenCfgData.py b/IntelFsp2Pkg/Tools/FspGenCfgData.py
new file mode 100644
index 0000000000..8d4e49c8d2
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/FspGenCfgData.py
@@ -0,0 +1,2637 @@
+# @ GenCfgData.py

+#

+# Copyright (c) 2014 - 2021, Intel Corporation. All rights reserved.<BR>

+# SPDX-License-Identifier: BSD-2-Clause-Patent

+#

+##

+

+import os

+import re

+import sys

+import marshal

+from functools import reduce

+from datetime import date

+

+# Generated file copyright header

+

+__copyright_tmp__ = """/** @file

+

+ Configuration %s File.

+

+ Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>

+ SPDX-License-Identifier: BSD-2-Clause-Patent

+

+ This file is automatically generated. Please do NOT modify !!!

+

+**/

+"""

+

+__copyright_dsc__ = """## @file

+#

+# Copyright (c) %04d, Intel Corporation. All rights reserved.<BR>

+# SPDX-License-Identifier: BSD-2-Clause-Patent

+#

+##

+

+[PcdsDynamicVpd.Upd]

+ #

+ # Global definitions in BSF

+ # !BSF BLOCK:{NAME:"FSP UPD Configuration", VER:"0.1"}

+ #

+

+"""

+

+

+def Bytes2Val(Bytes):

+ return reduce(lambda x, y: (x << 8) | y, Bytes[::-1])

+

+

+def Bytes2Str(Bytes):

+ return '{ %s }' % (', '.join('0x%02X' % i for i in Bytes))

+

+

+def Str2Bytes(Value, Blen):

+ Result = bytearray(Value[1:-1], 'utf-8') # Excluding quotes

+ if len(Result) < Blen:

+ Result.extend(b'\x00' * (Blen - len(Result)))

+ return Result

+

+

+def Val2Bytes(Value, Blen):

+ return [(Value >> (i * 8) & 0xff) for i in range(Blen)]

+

+

+def Array2Val(ValStr):

+ ValStr = ValStr.strip()

+ if ValStr.startswith('{'):

+ ValStr = ValStr[1:]

+ if ValStr.endswith('}'):

+ ValStr = ValStr[:-1]

+ if ValStr.startswith("'"):

+ ValStr = ValStr[1:]

+ if ValStr.endswith("'"):

+ ValStr = ValStr[:-1]

+ Value = 0

+ for Each in ValStr.split(',')[::-1]:

+ Each = Each.strip()

+ if Each.startswith('0x'):

+ Base = 16

+ else:

+ Base = 10

+ Value = (Value << 8) | int(Each, Base)

+ return Value
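The four helpers above convert between integers, byte lists, and the `{ 0x.., 0x.. }` string form, all little-endian. A standalone round-trip sketch (the helpers are copied here so the snippet runs on its own; this `Array2Val` is simplified to brace stripping only, without the quote handling in the patch):

```python
from functools import reduce

# Little-endian byte/value helpers, copied from the patch above.
def Bytes2Val(Bytes):
    return reduce(lambda x, y: (x << 8) | y, Bytes[::-1])

def Val2Bytes(Value, Blen):
    return [(Value >> (i * 8) & 0xff) for i in range(Blen)]

def Bytes2Str(Bytes):
    return '{ %s }' % (', '.join('0x%02X' % i for i in Bytes))

# Simplified Array2Val: strips braces only, no quote handling.
def Array2Val(ValStr):
    ValStr = ValStr.strip().lstrip('{').rstrip('}')
    Value = 0
    for Each in ValStr.split(',')[::-1]:
        Each = Each.strip()
        Value = (Value << 8) | int(Each, 16 if Each.startswith('0x') else 10)
    return Value

print(Val2Bytes(0x1234, 2))              # → [52, 18], i.e. [0x34, 0x12]
print(hex(Bytes2Val([0x34, 0x12])))      # → 0x1234
print(Bytes2Str([0x34, 0x12]))           # → { 0x34, 0x12 }
print(hex(Array2Val('{ 0x34, 0x12 }')))  # → 0x1234
```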

+

+

+def GetCopyrightHeader(FileType, AllowModify=False):

+ FileDescription = {

+ 'bsf': 'Boot Setting',

+ 'dsc': 'Definition',

+ 'dlt': 'Delta',

+ 'inc': 'C Binary Blob',

+ 'h': 'C Struct Header'

+ }

+ if FileType in ['bsf', 'dsc', 'dlt']:

+ CommentChar = '#'

+ else:

+ CommentChar = ''

+ Lines = __copyright_tmp__.split('\n')

+

+ if AllowModify:

+ Lines = [Line for Line in Lines if 'Please do NOT modify' not in Line]

+

+ CopyrightHdr = '\n'.join('%s%s' % (

+ CommentChar, Line) for Line in Lines)[:-1] + '\n'

+

+ return CopyrightHdr % (FileDescription[FileType], date.today().year)

+

+

+class CLogicalExpression:

+ def __init__(self):

+ self.index = 0

+ self.string = ''

+

+ def errExit(self, err=''):

+ print("ERROR: Express parsing for:")

+ print(" %s" % self.string)

+ print(" %s^" % (' ' * self.index))

+ if err:

+ print("INFO : %s" % err)

+ raise SystemExit

+

+ def getNonNumber(self, n1, n2):

+ if not n1.isdigit():

+ return n1

+ if not n2.isdigit():

+ return n2

+ return None

+

+ def getCurr(self, lens=1):

+ try:

+ if lens == -1:

+ return self.string[self.index:]

+ else:

+ if self.index + lens > len(self.string):

+ lens = len(self.string) - self.index

+ return self.string[self.index: self.index + lens]

+ except Exception:

+ return ''

+

+ def isLast(self):

+ return self.index == len(self.string)

+

+ def moveNext(self, len=1):

+ self.index += len

+

+ def skipSpace(self):

+ while not self.isLast():

+ if self.getCurr() in ' \t':

+ self.moveNext()

+ else:

+ return

+

+ def normNumber(self, val):

+ return True if val else False

+

+ def getNumber(self, var):

+ var = var.strip()

+ if re.match('^0x[a-fA-F0-9]+$', var):

+ value = int(var, 16)

+ elif re.match('^[+-]?\\d+$', var):

+ value = int(var, 10)

+ else:

+ value = None

+ return value

+

+ def parseValue(self):

+ self.skipSpace()

+ var = ''

+ while not self.isLast():

+ char = self.getCurr()

+ if re.match('^[\\w.]', char):

+ var += char

+ self.moveNext()

+ else:

+ break

+ val = self.getNumber(var)

+ if val is None:

+ value = var

+ else:

+ value = "%d" % val

+ return value

+

+ def parseSingleOp(self):

+ self.skipSpace()

+ if re.match('^NOT\\W', self.getCurr(-1)):

+ self.moveNext(3)

+ op = self.parseBrace()

+ val = self.getNumber(op)

+ if val is None:

+ self.errExit("'%s' is not a number" % op)

+ return "%d" % (not self.normNumber(int(op)))

+ else:

+ return self.parseValue()

+

+ def parseBrace(self):

+ self.skipSpace()

+ char = self.getCurr()

+ if char == '(':

+ self.moveNext()

+ value = self.parseExpr()

+ self.skipSpace()

+ if self.getCurr() != ')':

+ self.errExit("Expecting closing brace or operator")

+ self.moveNext()

+ return value

+ else:

+ value = self.parseSingleOp()

+ return value

+

+ def parseCompare(self):

+ value = self.parseBrace()

+ while True:

+ self.skipSpace()

+ char = self.getCurr()

+ if char in ['<', '>']:

+ self.moveNext()

+ next = self.getCurr()

+ if next == '=':

+ op = char + next

+ self.moveNext()

+ else:

+ op = char

+ result = self.parseBrace()

+ test = self.getNonNumber(result, value)

+ if test is None:

+ value = "%d" % self.normNumber(eval(value + op + result))

+ else:

+ self.errExit("'%s' is not a valid number for comparison"

+ % test)

+ elif char in ['=', '!']:

+ op = self.getCurr(2)

+ if op in ['==', '!=']:

+ self.moveNext(2)

+ result = self.parseBrace()

+ test = self.getNonNumber(result, value)

+ if test is None:

+ value = "%d" % self.normNumber((eval(value + op

+ + result)))

+ else:

+ value = "%d" % self.normNumber(eval("'" + value +

+ "'" + op + "'" +

+ result + "'"))

+ else:

+ break

+ else:

+ break

+ return value

+

+ def parseAnd(self):

+ value = self.parseCompare()

+ while True:

+ self.skipSpace()

+ if re.match('^AND\\W', self.getCurr(-1)):

+ self.moveNext(3)

+ result = self.parseCompare()

+ test = self.getNonNumber(result, value)

+ if test is None:

+ value = "%d" % self.normNumber(int(value) & int(result))

+ else:

+ self.errExit("'%s' is not a valid op number for AND" %

+ test)

+ else:

+ break

+ return value

+

+ def parseOrXor(self):

+ value = self.parseAnd()

+ op = None

+ while True:

+ self.skipSpace()

+ op = None

+ if re.match('^XOR\\W', self.getCurr(-1)):

+ self.moveNext(3)

+ op = '^'

+ elif re.match('^OR\\W', self.getCurr(-1)):

+ self.moveNext(2)

+ op = '|'

+ else:

+ break

+ if op:

+ result = self.parseAnd()

+ test = self.getNonNumber(result, value)

+ if test is None:

+ value = "%d" % self.normNumber(eval(value + op + result))

+ else:

+ self.errExit("'%s' is not a valid op number for XOR/OR" %

+ test)

+ return value

+

+ def parseExpr(self):

+ return self.parseOrXor()

+

+ def getResult(self):

+ value = self.parseExpr()

+ self.skipSpace()

+ if not self.isLast():

+ self.errExit("Unexpected character found '%s'" % self.getCurr())

+ test = self.getNumber(value)

+ if test is None:

+ self.errExit("Result '%s' is not a number" % value)

+ return int(value)

+

+ def evaluateExpress(self, Expr):

+ self.index = 0

+ self.string = Expr

+ if self.getResult():

+ Result = True

+ else:

+ Result = False

+ return Result
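`CLogicalExpression` above is a small recursive-descent evaluator; the call chain `parseExpr → parseOrXor → parseAnd → parseCompare → parseBrace → parseSingleOp` encodes the precedence NOT > comparison > AND > XOR/OR. A toy stand-in (not the patch's parser) that maps the BSF keywords onto Python operators just to show the intended truth semantics:

```python
# Toy stand-in for CLogicalExpression, for illustration only: keyword
# substitution plus eval, not the recursive-descent parser from the patch.
def evaluate(expr):
    expr = (expr.replace('XOR', ' ^ ')   # replace XOR before OR
                .replace('NOT', ' not ')
                .replace('AND', ' and ')
                .replace('OR', ' or '))
    return bool(eval(expr))

print(evaluate('1 AND (2 > 1)'))  # → True
print(evaluate('NOT 0 OR 0'))     # → True
print(evaluate('3 == 3 AND 1'))   # → True
```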

+

+

+class CFspBsf2Dsc:

+

+ def __init__(self, bsf_file):

+ self.cfg_list = CFspBsf2Dsc.parse_bsf(bsf_file)

+

+ def get_dsc_lines(self):

+ return CFspBsf2Dsc.generate_dsc(self.cfg_list)

+

+ def save_dsc(self, dsc_file):

+ return CFspBsf2Dsc.generate_dsc(self.cfg_list, dsc_file)

+

+ @staticmethod

+ def parse_bsf(bsf_file):

+

+ fd = open(bsf_file, 'r')

+ bsf_txt = fd.read()

+ fd.close()

+

+ find_list = []

+ regex = re.compile(r'\s+Find\s+"(.*?)"(.*?)^\s+(\$(.*?)|Skip)\s+',

+ re.S | re.MULTILINE)

+ for match in regex.finditer(bsf_txt):

+ find = match.group(1)

+ name = match.group(3)

+ line = bsf_txt[:match.end()].count("\n")

+ find_list.append((name, find, line))

+

+ idx = 0

+ count = 0

+ prefix = ''

+ chk_dict = {}

+ cfg_list = []

+ cfg_temp = {'find': '', 'cname': '', 'length': 0, 'value': '0',

+ 'type': 'Reserved', 'isbit': False,

+ 'embed': '', 'page': '', 'option': '', 'instance': 0}

+ regex = re.compile(

+ r'^\s+(\$(.*?)|Skip)\s+(\d+)\s+(bits|bytes)(\s+\$_DEFAULT_\s'

+ r'+=\s+(.+?))?$', re.S |

+ re.MULTILINE)

+

+ for match in regex.finditer(bsf_txt):

+ dlen = int(match.group(3))

+ if match.group(1) == 'Skip':

+ key = 'gPlatformFspPkgTokenSpaceGuid_BsfSkip%d' % idx

+ val = ', '.join(['%02X' % ord(i) for i in '\x00' * dlen])

+ idx += 1

+ option = '$SKIP'

+ else:

+ key = match.group(2)

+ val = match.group(6)

+ option = ''

+ is_bit = True if match.group(4) == 'bits' else False

+

+ cfg_item = dict(cfg_temp)

+ line = bsf_txt[:match.end()].count("\n")

+ finds = [i for i in find_list if line >= i[2]]

+ if len(finds) > 0:

+ prefix = finds[0][1]

+ cfg_item['embed'] = '%s:TAG_%03X:START' % \

+ (prefix, ord(prefix[-1]))

+ cfg_item['find'] = prefix

+ cfg_item['cname'] = 'Signature'

+ cfg_item['length'] = len(finds[0][1])

+ str2byte = Str2Bytes("'" + finds[0][1] + "'",

+ len(finds[0][1]))

+ cfg_item['value'] = '0x%X' % Bytes2Val(str2byte)

+

+ cfg_list.append(dict(cfg_item))

+ cfg_item = dict(cfg_temp)

+ find_list.pop(0)

+ count = 0

+

+ cfg_item['cname'] = key

+ cfg_item['length'] = dlen

+ cfg_item['value'] = val

+ cfg_item['option'] = option

+ cfg_item['isbit'] = is_bit

+

+ if key not in chk_dict.keys():

+ chk_dict[key] = 0

+ else:

+ chk_dict[key] += 1

+ cfg_item['instance'] = chk_dict[key]

+

+ cfg_list.append(cfg_item)

+ count += 1

+

+ if prefix:

+ cfg_item = dict(cfg_temp)

+ cfg_item['cname'] = 'Dummy'

+ cfg_item['embed'] = '%s:%03X:END' % (prefix, ord(prefix[-1]))

+ cfg_list.append(cfg_item)

+

+ option_dict = {}

+ selreg = re.compile(

+ r'\s+Selection\s*(.+?)\s*,\s*"(.*?)"$', re.S |

+ re.MULTILINE)

+ regex = re.compile(

+ r'^List\s&(.+?)$(.+?)^EndList$', re.S | re.MULTILINE)

+ for match in regex.finditer(bsf_txt):

+ key = match.group(1)

+ option_dict[key] = []

+ for select in selreg.finditer(match.group(2)):

+ option_dict[key].append(

+ (int(select.group(1), 0), select.group(2)))

+

+ chk_dict = {}

+ pagereg = re.compile(

+ r'^Page\s"(.*?)"$(.+?)^EndPage$', re.S | re.MULTILINE)

+ for match in pagereg.finditer(bsf_txt):

+ page = match.group(1)

+ for line in match.group(2).splitlines():

+ match = re.match(

+ r'\s+(Combo|EditNum)\s\$(.+?),\s"(.*?)",\s(.+?),$', line)

+ if match:

+ cname = match.group(2)

+ if cname not in chk_dict.keys():

+ chk_dict[cname] = 0

+ else:

+ chk_dict[cname] += 1

+ instance = chk_dict[cname]

+ cfg_idxs = [i for i, j in enumerate(cfg_list)

+ if j['cname'] == cname and

+ j['instance'] == instance]

+ if len(cfg_idxs) != 1:

+ raise Exception(

+ "Multiple CFG item '%s' found !" % cname)

+ cfg_item = cfg_list[cfg_idxs[0]]

+ cfg_item['page'] = page

+ cfg_item['type'] = match.group(1)

+ cfg_item['prompt'] = match.group(3)

+ cfg_item['range'] = None

+ if cfg_item['type'] == 'Combo':

+ cfg_item['option'] = option_dict[match.group(4)[1:]]

+ elif cfg_item['type'] == 'EditNum':

+ cfg_item['option'] = match.group(4)

+ match = re.match(r'\s+ Help\s"(.*?)"$', line)

+ if match:

+ cfg_item['help'] = match.group(1)

+

+ match = re.match(r'\s+"Valid\srange:\s(.*)"$', line)

+ if match:

+ parts = match.group(1).split()

+ cfg_item['option'] = (

+ (int(parts[0], 0), int(parts[2], 0),

+ cfg_item['option']))

+

+ return cfg_list

+

+ @staticmethod

+ def generate_dsc(option_list, dsc_file=None):

+ dsc_lines = []

+ header = '%s' % (__copyright_dsc__ % date.today().year)

+ dsc_lines.extend(header.splitlines())

+

+ pages = []

+ for cfg_item in option_list:

+ if cfg_item['page'] and (cfg_item['page'] not in pages):

+ pages.append(cfg_item['page'])

+

+ page_id = 0

+ for page in pages:

+ dsc_lines.append(' # !BSF PAGES:{PG%02X::"%s"}' % (page_id, page))

+ page_id += 1

+ dsc_lines.append('')

+

+ last_page = ''

+

+ is_bit = False

+ dlen = 0

+ dval = 0

+ bit_fields = []

+ for idx, option in enumerate(option_list):

+ if not is_bit and option['isbit']:

+ is_bit = True

+ dlen = 0

+ dval = 0

+ idxs = idx

+ if is_bit and not option['isbit']:

+ is_bit = False

+ if dlen % 8 != 0:

+ raise Exception("Bit fields are not aligned at "

+ "byte boundary !")

+ bit_fields.append((idxs, idx, dlen, dval))

+ if is_bit:

+ blen = option['length']

+ bval = int(option['value'], 0)

+ dval = dval + ((bval & ((1 << blen) - 1)) << dlen)

+ print(dlen, blen, bval, hex(dval))

+ dlen += blen

+

+ struct_idx = 0

+ for idx, option in enumerate(option_list):

+ dsc_lines.append('')

+ default = option['value']

+ pos = option['cname'].find('_')

+ name = option['cname'][pos + 1:]

+

+ for start_idx, end_idx, bits_len, bits_val in bit_fields:

+ if idx == start_idx:

+ val_str = Bytes2Str(Val2Bytes(bits_val, bits_len // 8))

+ dsc_lines.append(' # !HDR STRUCT:{BIT_FIELD_DATA_%d}'

+ % struct_idx)

+ dsc_lines.append(' # !BSF NAME:{BIT_FIELD_STRUCT}')

+ dsc_lines.append(' gCfgData.BitFiledStruct%d '

+ ' | * | 0x%04X | %s' %

+ (struct_idx, bits_len // 8, val_str))

+ dsc_lines.append('')

+ struct_idx += 1

+

+ if option['find']:

+ dsc_lines.append(' # !BSF FIND:{%s}' % option['find'])

+ dsc_lines.append('')

+

+ if option['instance'] > 0:

+ name = name + '_%s' % option['instance']

+

+ if option['embed']:

+ dsc_lines.append(' # !HDR EMBED:{%s}' % option['embed'])

+

+ if option['type'] == 'Reserved':

+ dsc_lines.append(' # !BSF NAME:{Reserved} TYPE:{Reserved}')

+ if option['option'] == '$SKIP':

+ dsc_lines.append(' # !BSF OPTION:{$SKIP}')

+ else:

+ prompt = option['prompt']

+

+ if last_page != option['page']:

+ last_page = option['page']

+ dsc_lines.append(' # !BSF PAGE:{PG%02X}' %

+ (pages.index(option['page'])))

+

+ if option['type'] == 'Combo':

+ dsc_lines.append(' # !BSF NAME:{%s} TYPE:{%s}' %

+ (prompt, option['type']))

+ ops = []

+ for val, text in option['option']:

+ ops.append('0x%x:%s' % (val, text))

+ dsc_lines.append(' # !BSF OPTION:{%s}' % (', '.join(ops)))

+ elif option['type'] == 'EditNum':

+ cfg_len = option['length']

+ if ',' in default and cfg_len > 8:

+ dsc_lines.append(' # !BSF NAME:{%s} TYPE:{Table}' %

+ (prompt))

+ if cfg_len > 16:

+ cfg_len = 16

+ ops = []

+ for i in range(cfg_len):

+ ops.append('%X:1:HEX' % i)

+ dsc_lines.append(' # !BSF OPTION:{%s}' %

+ (', '.join(ops)))

+ else:

+ dsc_lines.append(

+ ' # !BSF NAME:{%s} TYPE:{%s, %s, (0x%X, 0x%X)}' %

+ (prompt, option['type'], option['option'][2],

+ option['option'][0], option['option'][1]))

+ dsc_lines.append(' # !BSF HELP:{%s}' % option['help'])

+

+ if ',' in default:

+ default = '{%s}' % default

+

+ if option['isbit']:

+ dsc_lines.append(' # !BSF FIELD:{%s:%db}'

+ % (name, option['length']))

+ else:

+ dsc_lines.append(' gCfgData.%-30s | * | 0x%04X | %s' %

+ (name, option['length'], default))

+

+ if dsc_file:

+ fd = open(dsc_file, 'w')

+ fd.write('\n'.join(dsc_lines))

+ fd.close()

+

+ return dsc_lines

+

+

+class CGenCfgData:

+ def __init__(self, Mode=''):

+ self.Debug = False

+ self.Error = ''

+ self.ReleaseMode = True

+ self.Mode = Mode

+ self._GlobalDataDef = """

+GlobalDataDef

+ SKUID = 0, "DEFAULT"

+EndGlobalData

+

+"""

+ self._BuidinOptionTxt = """

+List &EN_DIS

+ Selection 0x1 , "Enabled"

+ Selection 0x0 , "Disabled"

+EndList

+

+"""

+ self._StructType = ['UINT8', 'UINT16', 'UINT32', 'UINT64']

+ self._BsfKeyList = ['FIND', 'NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES',

+ 'BLOCK', 'OPTION', 'CONDITION', 'ORDER', 'MARKER',

+ 'SUBT']

+ self._HdrKeyList = ['HEADER', 'STRUCT', 'EMBED', 'COMMENT']

+ self._BuidinOption = {'$EN_DIS': 'EN_DIS'}

+

+ self._MacroDict = {}

+ self._VarDict = {}

+ self._PcdsDict = {}

+ self._CfgBlkDict = {}

+ self._CfgPageDict = {}

+ self._CfgOptsDict = {}

+ self._BsfTempDict = {}

+ self._CfgItemList = []

+ self._DscLines = []

+ self._DscFile = ''

+ self._CfgPageTree = {}

+

+ self._MapVer = 0

+ self._MinCfgTagId = 0x100

+

+ def ParseMacros(self, MacroDefStr):

+ # ['-DABC=1', '-D', 'CFG_DEBUG=1', '-D', 'CFG_OUTDIR=Build']

+ self._MacroDict = {}

+ IsExpression = False

+ for Macro in MacroDefStr:

+ if Macro.startswith('-D'):

+ IsExpression = True

+ if len(Macro) > 2:

+ Macro = Macro[2:]

+ else:

+ continue

+ if IsExpression:

+ IsExpression = False

+ Match = re.match("(\\w+)=(.+)", Macro)

+ if Match:

+ self._MacroDict[Match.group(1)] = Match.group(2)

+ else:

+ Match = re.match("(\\w+)", Macro)

+ if Match:

+ self._MacroDict[Match.group(1)] = ''

+ if len(self._MacroDict) == 0:

+ Error = 1

+ else:

+ Error = 0

+ if self.Debug:

+ print("INFO : Macro dictionary:")

+ for Each in self._MacroDict:

+ print(" $(%s) = [ %s ]" % (Each,

+ self._MacroDict[Each]))

+ return Error
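`ParseMacros` above accepts both the joined `-DNAME=VALUE` form and the split `-D NAME=VALUE` form, as its comment shows. A simplified standalone sketch of that parsing logic (function name is mine, not the patch's):

```python
import re

# Simplified stand-in for CGenCfgData.ParseMacros: handles '-DNAME=VAL'
# and the split '-D', 'NAME=VAL' argument forms.
def parse_macros(args):
    macros = {}
    is_expr = False
    for arg in args:
        if arg.startswith('-D'):
            is_expr = True
            if len(arg) > 2:
                arg = arg[2:]
            else:
                continue  # value comes in the next argument
        if is_expr:
            is_expr = False
            m = re.match(r'(\w+)=(.+)', arg)
            if m:
                macros[m.group(1)] = m.group(2)
            else:
                m = re.match(r'(\w+)', arg)
                if m:
                    macros[m.group(1)] = ''
    return macros

print(parse_macros(['-DABC=1', '-D', 'CFG_DEBUG=1', '-D', 'CFG_OUTDIR=Build']))
# → {'ABC': '1', 'CFG_DEBUG': '1', 'CFG_OUTDIR': 'Build'}
```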

+

+ def EvaulateIfdef(self, Macro):

+ Result = Macro in self._MacroDict

+ if self.Debug:

+ print("INFO : Eval Ifdef [%s] : %s" % (Macro, Result))

+ return Result

+

+ def ExpandMacros(self, Input, Preserve=False):

+ Line = Input

+ Match = re.findall("\\$\\(\\w+\\)", Input)

+ if Match:

+ for Each in Match:

+ Variable = Each[2:-1]

+ if Variable in self._MacroDict:

+ Line = Line.replace(Each, self._MacroDict[Variable])

+ else:

+ if self.Debug:

+ print("WARN : %s is not defined" % Each)

+ if not Preserve:

+ Line = Line.replace(Each, Each[2:-1])

+ return Line

+

+ def ExpandPcds(self, Input):

+ Line = Input

+ Match = re.findall("(\\w+\\.\\w+)", Input)

+ if Match:

+ for PcdName in Match:

+ if PcdName in self._PcdsDict:

+ Line = Line.replace(PcdName, self._PcdsDict[PcdName])

+ else:

+ if self.Debug:

+ print("WARN : %s is not defined" % PcdName)

+ return Line

+

+ def EvaluateExpress(self, Expr):

+ ExpExpr = self.ExpandPcds(Expr)

+ ExpExpr = self.ExpandMacros(ExpExpr)

+ LogExpr = CLogicalExpression()

+ Result = LogExpr.evaluateExpress(ExpExpr)

+ if self.Debug:

+ print("INFO : Eval Express [%s] : %s" % (Expr, Result))

+ return Result

+

+ def ValueToByteArray(self, ValueStr, Length):

+ Match = re.match("\\{\\s*FILE:(.+)\\}", ValueStr)

+ if Match:

+ FileList = Match.group(1).split(',')

+ Result = bytearray()

+ for File in FileList:

+ File = File.strip()

+ BinPath = os.path.join(os.path.dirname(self._DscFile), File)

+ Result.extend(bytearray(open(BinPath, 'rb').read()))

+ else:

+ try:

+ Result = bytearray(self.ValueToList(ValueStr, Length))

+ except ValueError:

+ raise Exception("Bytes in '%s' must be in range 0~255 !" %

+ ValueStr)

+ if len(Result) < Length:

+ Result.extend(b'\x00' * (Length - len(Result)))

+ elif len(Result) > Length:

+ raise Exception("Value '%s' is too big to fit into %d bytes !" %

+ (ValueStr, Length))

+

+ return Result[:Length]
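`ValueToByteArray` above zero-pads values shorter than the target length and rejects longer ones (the `{ FILE:... }` branch instead concatenates binary files). A simplified sketch of just the pad/size check, with the file and expression branches omitted:

```python
# Simplified pad/size-check behavior of ValueToByteArray
# (the FILE: and expression-evaluation branches are omitted).
def to_byte_array(data, length):
    result = bytearray(data)
    if len(result) < length:
        result.extend(b'\x00' * (length - len(result)))
    elif len(result) > length:
        raise Exception("Value is too big to fit into %d bytes !" % length)
    return result[:length]

print(list(to_byte_array(b'\x12\x34', 4)))  # → [18, 52, 0, 0]
```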

+

+ def ValueToList(self, ValueStr, Length):

+ if ValueStr[0] == '{':

+ Result = []

+ BinList = ValueStr[1:-1].split(',')

+ InBitField = False

+ LastInBitField = False

+ Value = 0

+ BitLen = 0

+ for Element in BinList:

+ InBitField = False

+ Each = Element.strip()

+ if len(Each) == 0:

+ pass

+ else:

+ if Each[0] in ['"', "'"]:

+ Result.extend(list(bytearray(Each[1:-1], 'utf-8')))

+ elif ':' in Each:

+ Match = re.match("(.+):(\\d+)b", Each)

+ if Match is None:

+ raise Exception("Invalid value list format '%s' !"

+ % Each)

+ InBitField = True

+ CurrentBitLen = int(Match.group(2))

+ CurrentValue = ((self.EvaluateExpress(Match.group(1))

+ & (1 << CurrentBitLen) - 1)) << BitLen

+ else:

+ Result.append(self.EvaluateExpress(Each.strip()))

+ if InBitField:

+ Value += CurrentValue

+ BitLen += CurrentBitLen

+ if LastInBitField and ((not InBitField) or (Element ==

+ BinList[-1])):

+ if BitLen % 8 != 0:

+ raise Exception("Invalid bit field length!")

+ Result.extend(Val2Bytes(Value, BitLen // 8))

+ Value = 0

+ BitLen = 0

+ LastInBitField = InBitField

+ elif ValueStr.startswith("'") and ValueStr.endswith("'"):

+ Result = Str2Bytes(ValueStr, Length)

+ elif ValueStr.startswith('"') and ValueStr.endswith('"'):

+ Result = Str2Bytes(ValueStr, Length)

+ else:

+ Result = Val2Bytes(self.EvaluateExpress(ValueStr), Length)

+ return Result

+

+ def FormatDeltaValue(self, ConfigDict):

+ ValStr = ConfigDict['value']

+ if ValStr[0] == "'":

+ # Remove padding \x00 in the value string

+ ValStr = "'%s'" % ValStr[1:-1].rstrip('\x00')

+

+ Struct = ConfigDict['struct']

+ if Struct in self._StructType:

+ # Format the array using its struct type

+ Unit = int(Struct[4:]) // 8

+ Value = Array2Val(ConfigDict['value'])

+ Loop = ConfigDict['length'] // Unit

+ Values = []

+ for Each in range(Loop):

+ Values.append(Value & ((1 << (Unit * 8)) - 1))

+ Value = Value >> (Unit * 8)

+ ValStr = '{ ' + ', '.join([('0x%%0%dX' % (Unit * 2)) %

+ x for x in Values]) + ' }'

+

+ return ValStr

+

+ def FormatListValue(self, ConfigDict):

+ Struct = ConfigDict['struct']

+ if Struct not in self._StructType:

+ return

+

+ DataList = self.ValueToList(ConfigDict['value'], ConfigDict['length'])

+ Unit = int(Struct[4:]) // 8

+ if int(ConfigDict['length']) != Unit * len(DataList):

+ # Fallback to byte array

+ Unit = 1

+ if int(ConfigDict['length']) != len(DataList):

+ raise Exception("Array size is not proper for '%s' !" %

+ ConfigDict['cname'])

+

+ ByteArray = []

+ for Value in DataList:

+ for Loop in range(Unit):

+ ByteArray.append("0x%02X" % (Value & 0xFF))

+ Value = Value >> 8

+ NewValue = '{' + ','.join(ByteArray) + '}'

+ ConfigDict['value'] = NewValue

+

+ return ""

+

+ def GetOrderNumber(self, Offset, Order, BitOff=0):

+ if isinstance(Order, int):

+ if Order == -1:

+ Order = Offset << 16

+ else:

+ (Major, Minor) = Order.split('.')

+ Order = (int(Major, 16) << 16) + ((int(Minor, 16) & 0xFF) << 8)

+ return Order + (BitOff & 0xFF)
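`GetOrderNumber` above packs an ordering key: the offset (or the hex major number) goes in bits 16 and up, the hex minor number in bits 8-15, and the bit offset in the low byte. A standalone copy for illustration (snake_case name is mine):

```python
# Standalone copy of GetOrderNumber's packing logic.
def get_order_number(offset, order, bit_off=0):
    # -1 means "order by offset"; a 'major.minor' string is a hex pair.
    if isinstance(order, int):
        if order == -1:
            order = offset << 16
    else:
        major, minor = order.split('.')
        order = (int(major, 16) << 16) + ((int(minor, 16) & 0xFF) << 8)
    return order + (bit_off & 0xFF)

print(hex(get_order_number(0x20, -1)))        # → 0x200000
print(hex(get_order_number(0x20, '1.2')))     # → 0x10200
print(hex(get_order_number(0x20, '1.2', 5)))  # → 0x10205
```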

+

+ def SubtituteLine(self, Line, Args):

+ Args = Args.strip()

+ Vars = Args.split(':')

+ Line = self.ExpandMacros(Line, True)

+ for Idx in range(len(Vars)-1, 0, -1):

+ Line = Line.replace('$(%d)' % Idx, Vars[Idx].strip())

+ return Line

+

+ def CfgDuplicationCheck(self, CfgDict, Name):

+ if not self.Debug:

+ return

+

+ if Name == 'Dummy':

+ return

+

+ if Name not in CfgDict:

+ CfgDict[Name] = 1

+ else:

+ print("WARNING: Duplicated item found '%s' !" % Name)

+

+ def AddBsfChildPage(self, Child, Parent='root'):

+ def AddBsfChildPageRecursive(PageTree, Parent, Child):

+ Key = next(iter(PageTree))

+ if Parent == Key:

+ PageTree[Key].append({Child: []})

+ return True

+ else:

+ Result = False

+ for Each in PageTree[Key]:

+ if AddBsfChildPageRecursive(Each, Parent, Child):

+ Result = True

+ break

+ return Result

+

+ return AddBsfChildPageRecursive(self._CfgPageTree, Parent, Child)

+
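AddBsfChildPage above inserts into a nested page tree where each node is a one-key dict mapping a page name to a list of child nodes. The recursion can be sketched standalone (illustrative re-implementation):

```python
def add_child_page(page_tree, parent, child):
    # Each node is {name: [child_nodes]}; insert `child` under `parent` if found.
    key = next(iter(page_tree))
    if parent == key:
        page_tree[key].append({child: []})
        return True
    # any() short-circuits, matching the original's break-on-first-success.
    return any(add_child_page(each, parent, child) for each in page_tree[key])
```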

+ def ParseDscFile(self, DscFile):

+ self._DscLines = []

+ self._CfgItemList = []

+ self._CfgPageDict = {}

+ self._CfgBlkDict = {}

+ self._BsfTempDict = {}

+ self._CfgPageTree = {'root': []}

+

+ CfgDict = {}

+

+ SectionNameList = ["Defines".lower(), "PcdsFeatureFlag".lower(),

+ "PcdsDynamicVpd.Tmp".lower(),

+ "PcdsDynamicVpd.Upd".lower()]

+

+ IsDefSect = False

+ IsPcdSect = False

+ IsUpdSect = False

+ IsTmpSect = False

+

+ TemplateName = ''

+

+ IfStack = []

+ ElifStack = []

+ Error = 0

+ ConfigDict = {}

+

+ if type(DscFile) is list:

+ # it is DSC lines already

+ DscLines = DscFile

+ self._DscFile = '.'

+ else:

+ DscFd = open(DscFile, "r")

+ DscLines = DscFd.readlines()

+ DscFd.close()

+ self._DscFile = DscFile

+

+ BsfRegExp = re.compile("(%s):{(.+?)}(?:$|\\s+)" % '|'.

+ join(self._BsfKeyList))

+ HdrRegExp = re.compile("(%s):{(.+?)}" % '|'.join(self._HdrKeyList))

+ CfgRegExp = re.compile("^([_a-zA-Z0-9]+)\\s*\\|\\s*\

+(0x[0-9A-F]+|\\*)\\s*\\|\\s*(\\d+|0x[0-9a-fA-F]+)\\s*\\|\\s*(.+)")

+ TksRegExp = re.compile("^(g[_a-zA-Z0-9]+\\.)(.+)")

+ SkipLines = 0

+ while len(DscLines):

+ DscLine = DscLines.pop(0).strip()

+ if SkipLines == 0:

+ self._DscLines.append(DscLine)

+ else:

+ SkipLines = SkipLines - 1

+ if len(DscLine) == 0:

+ continue

+

+ Handle = False

+ Match = re.match("^\\[(.+)\\]", DscLine)

+ if Match is not None:

+ IsDefSect = False

+ IsPcdSect = False

+ IsUpdSect = False

+ IsTmpSect = False

+ SectionName = Match.group(1).lower()

+ if SectionName == SectionNameList[0]:

+ IsDefSect = True

+ if SectionName == SectionNameList[1]:

+ IsPcdSect = True

+ elif SectionName == SectionNameList[2]:

+ IsTmpSect = True

+ elif SectionName == SectionNameList[3]:

+ ConfigDict = {

+ 'header': 'ON',

+ 'page': '',

+ 'name': '',

+ 'find': '',

+ 'struct': '',

+ 'embed': '',

+ 'marker': '',

+ 'option': '',

+ 'comment': '',

+ 'condition': '',

+ 'order': -1,

+ 'subreg': []

+ }

+ IsUpdSect = True

+ Offset = 0

+ else:

+ if IsDefSect or IsPcdSect or IsUpdSect or IsTmpSect:

+ Match = DscLine[0] == '!'

+ if Match:

+ Match = re.match("^!(else|endif|ifdef|ifndef|if|elseif\

+|include)\\s*(.+)?$", DscLine.split("#")[0])

+ Keyword = Match.group(1) if Match else ''

+ Remaining = Match.group(2) if Match else ''

+ Remaining = '' if Remaining is None else Remaining.strip()

+

+ if Keyword in ['if', 'elseif', 'ifdef', 'ifndef', 'include'

+ ] and not Remaining:

+ raise Exception("ERROR: Expression is expected after \

+'!if' or '!elseif' for line '%s'" % DscLine)

+

+ if Keyword == 'else':

+ if IfStack:

+ IfStack[-1] = not IfStack[-1]

+ else:

+ raise Exception("ERROR: No paired '!if' found for \

+'!else' for line '%s'" % DscLine)

+ elif Keyword == 'endif':

+ if IfStack:

+ IfStack.pop()

+ Level = ElifStack.pop()

+ if Level > 0:

+ del IfStack[-Level:]

+ else:

+ raise Exception("ERROR: No paired '!if' found for \

+'!endif' for line '%s'" % DscLine)

+ elif Keyword == 'ifdef' or Keyword == 'ifndef':

+ Result = self.EvaulateIfdef(Remaining)

+ if Keyword == 'ifndef':

+ Result = not Result

+ IfStack.append(Result)

+ ElifStack.append(0)

+ elif Keyword == 'if' or Keyword == 'elseif':

+ Result = self.EvaluateExpress(Remaining)

+ if Keyword == "if":

+ ElifStack.append(0)

+ IfStack.append(Result)

+ else: # elseif

+ if IfStack:

+ IfStack[-1] = not IfStack[-1]

+ IfStack.append(Result)

+ ElifStack[-1] = ElifStack[-1] + 1

+ else:

+ raise Exception("ERROR: No paired '!if' found for \

+'!elif' for line '%s'" % DscLine)

+ else:

+ if IfStack:

+ Handle = reduce(lambda x, y: x and y, IfStack)

+ else:

+ Handle = True

+ if Handle:

+ if Keyword == 'include':

+ Remaining = self.ExpandMacros(Remaining)

+ # Relative to DSC filepath

+ IncludeFilePath = os.path.join(

+ os.path.dirname(self._DscFile), Remaining)

+ if not os.path.exists(IncludeFilePath):

+ # Relative to repository to find \

+ # dsc in common platform

+ IncludeFilePath = os.path.join(

+ os.path.dirname(self._DscFile), "..",

+ Remaining)

+

+ try:

+ IncludeDsc = open(IncludeFilePath, "r")

+ except Exception:

+ raise Exception("ERROR: Cannot open \

+file '%s'." % IncludeFilePath)

+ NewDscLines = IncludeDsc.readlines()

+ IncludeDsc.close()

+ DscLines = NewDscLines + DscLines

+ del self._DscLines[-1]

+ else:

+ if DscLine.startswith('!'):

+ raise Exception("ERROR: Unrecognized \

+directive for line '%s'" % DscLine)

+

+ if not Handle:

+ del self._DscLines[-1]

+ continue

+

+ if IsDefSect:

+ Match = re.match("^\\s*(?:DEFINE\\s+)*(\\w+)\\s*=\\s*(.+)",

+ DscLine)

+ if Match:

+ self._MacroDict[Match.group(1)] = Match.group(2)

+ if self.Debug:

+ print("INFO : DEFINE %s = [ %s ]" % (Match.group(1),

+ Match.group(2)))

+

+ elif IsPcdSect:

+ Match = re.match("^\\s*([\\w\\.]+)\\s*\\|\\s*(\\w+)", DscLine)

+ if Match:

+ self._PcdsDict[Match.group(1)] = Match.group(2)

+ if self.Debug:

+ print("INFO : PCD %s = [ %s ]" % (Match.group(1),

+ Match.group(2)))

+

+ elif IsTmpSect:

+ # !BSF DEFT:{GPIO_TMPL:START}

+ Match = re.match("^\\s*#\\s+(!BSF)\\s+DEFT:{(.+?):\

+(START|END)}", DscLine)

+ if Match:

+ if Match.group(3) == 'START' and not TemplateName:

+ TemplateName = Match.group(2).strip()

+ self._BsfTempDict[TemplateName] = []

+ if Match.group(3) == 'END' and (

+ TemplateName == Match.group(2).strip()

+ ) and TemplateName:

+ TemplateName = ''

+ else:

+ if TemplateName:

+ Match = re.match("^!include\\s*(.+)?$", DscLine)

+ if Match:

+ continue

+ self._BsfTempDict[TemplateName].append(DscLine)

+

+ else:

+ Match = re.match("^\\s*#\\s+(!BSF|!HDR)\\s+(.+)", DscLine)

+ if Match:

+ Remaining = Match.group(2)

+ if Match.group(1) == '!BSF':

+ Result = BsfRegExp.findall(Remaining)

+ if Result:

+ for Each in Result:

+ Key = Each[0]

+ Remaining = Each[1]

+

+ if Key == 'BLOCK':

+ Match = re.match(

+ "NAME:\"(.+)\"\\s*,\\s*\

+VER:\"(.+)\"\\s*", Remaining)

+ if Match:

+ self._CfgBlkDict['name'] = \

+ Match.group(1)

+ self._CfgBlkDict['ver'] = Match.group(2

+ )

+

+ elif Key == 'SUBT':

+ # GPIO_TMPL:1:2:3

+ Remaining = Remaining.strip()

+ Match = re.match("(\\w+)\\s*:", Remaining)

+ if Match:

+ TemplateName = Match.group(1)

+ for Line in self._BsfTempDict[

+ TemplateName][::-1]:

+ NewLine = self.SubtituteLine(

+ Line, Remaining)

+ DscLines.insert(0, NewLine)

+ SkipLines += 1

+

+ elif Key == 'PAGES':

+ # !BSF PAGES:{HSW:"Haswell System Agent", \

+ # LPT:"Lynx Point PCH"}

+ PageList = Remaining.split(',')

+ for Page in PageList:

+ Page = Page.strip()

+ Match = re.match('(\\w+):\

+(\\w*:)?\\"(.+)\\"', Page)

+ if Match:

+ PageName = Match.group(1)

+ ParentName = Match.group(2)

+ if not ParentName or \

+ ParentName == ':':

+ ParentName = 'root'

+ else:

+ ParentName = ParentName[:-1]

+ if not self.AddBsfChildPage(

+ PageName, ParentName):

+ raise Exception("Cannot find \

+parent page '%s'!" % ParentName)

+ self._CfgPageDict[

+ PageName] = Match.group(3)

+ else:

+ raise Exception("Invalid page \

+definitions '%s'!" % Page)

+

+ elif Key in ['NAME', 'HELP', 'OPTION'

+ ] and Remaining.startswith('+'):

+ # Allow certain options to be extended \

+ # to multiple lines

+ ConfigDict[Key.lower()] += Remaining[1:]

+

+ else:

+ if Key == 'NAME':

+ Remaining = Remaining.strip()

+ elif Key == 'CONDITION':

+ Remaining = self.ExpandMacros(

+ Remaining.strip())

+ ConfigDict[Key.lower()] = Remaining

+ else:

+ Match = HdrRegExp.match(Remaining)

+ if Match:

+ Key = Match.group(1)

+ Remaining = Match.group(2)

+ if Key == 'EMBED':

+ Parts = Remaining.split(':')

+ Names = Parts[0].split(',')

+ DummyDict = ConfigDict.copy()

+ if len(Names) > 1:

+ Remaining = Names[0] + ':' + ':'.join(

+ Parts[1:])

+ DummyDict['struct'] = Names[1]

+ else:

+ DummyDict['struct'] = Names[0]

+ DummyDict['cname'] = 'Dummy'

+ DummyDict['name'] = ''

+ DummyDict['embed'] = Remaining

+ DummyDict['offset'] = Offset

+ DummyDict['length'] = 0

+ DummyDict['value'] = '0'

+ DummyDict['type'] = 'Reserved'

+ DummyDict['help'] = ''

+ DummyDict['subreg'] = []

+ self._CfgItemList.append(DummyDict)

+ else:

+ ConfigDict[Key.lower()] = Remaining

+ # Check CFG line

+ # gCfgData.VariableName | * | 0x01 | 0x1

+ Clear = False

+

+ Match = TksRegExp.match(DscLine)

+ if Match:

+ DscLine = 'gCfgData.%s' % Match.group(2)

+

+ if DscLine.startswith('gCfgData.'):

+ Match = CfgRegExp.match(DscLine[9:])

+ else:

+ Match = None

+ if Match:

+ ConfigDict['space'] = 'gCfgData'

+ ConfigDict['cname'] = Match.group(1)

+ if Match.group(2) != '*':

+ Offset = int(Match.group(2), 16)

+ ConfigDict['offset'] = Offset

+ ConfigDict['order'] = self.GetOrderNumber(

+ ConfigDict['offset'], ConfigDict['order'])

+

+ Value = Match.group(4).strip()

+ if Match.group(3).startswith("0x"):

+ Length = int(Match.group(3), 16)

+ else:

+ Length = int(Match.group(3))

+

+ Offset += Length

+

+ ConfigDict['length'] = Length

+ Match = re.match("\\$\\((\\w+)\\)", Value)

+ if Match:

+ if Match.group(1) in self._MacroDict:

+ Value = self._MacroDict[Match.group(1)]

+

+ ConfigDict['value'] = Value

+ if re.match("\\{\\s*FILE:(.+)\\}", Value):

+ # Expand embedded binary file

+ ValArray = self.ValueToByteArray(ConfigDict['value'],

+ ConfigDict['length'])

+ NewValue = Bytes2Str(ValArray)

+ self._DscLines[-1] = re.sub(r'(.*)(\{\s*FILE:.+\})',

+ r'\1 %s' % NewValue,

+ self._DscLines[-1])

+ ConfigDict['value'] = NewValue

+

+ if ConfigDict['name'] == '':

+ # Clear BSF specific items

+ ConfigDict['bsfname'] = ''

+ ConfigDict['help'] = ''

+ ConfigDict['type'] = ''

+ ConfigDict['option'] = ''

+

+ self.CfgDuplicationCheck(CfgDict, ConfigDict['cname'])

+ self._CfgItemList.append(ConfigDict.copy())

+ Clear = True

+

+ else:

+ # It could be a virtual item as below

+ # !BSF FIELD:{SerialDebugPortAddress0:1}

+ # or

+ # @Bsf FIELD:{SerialDebugPortAddress0:1b}

+ Match = re.match(r"^\s*#\s+(!BSF)\s+FIELD:{(.+)}", DscLine)

+ if Match:

+ BitFieldTxt = Match.group(2)

+ Match = re.match("(.+):(\\d+)b([BWDQ])?", BitFieldTxt)

+ if not Match:

+ raise Exception("Incorrect bit field \

+format '%s' !" % BitFieldTxt)

+ UnitBitLen = 1

+ SubCfgDict = ConfigDict.copy()

+ SubCfgDict['cname'] = Match.group(1)

+ SubCfgDict['bitlength'] = int(

+ Match.group(2)) * UnitBitLen

+ if SubCfgDict['bitlength'] > 0:

+ LastItem = self._CfgItemList[-1]

+ if len(LastItem['subreg']) == 0:

+ SubOffset = 0

+ else:

+ SubOffset = \

+ LastItem['subreg'][-1]['bitoffset'] \

+ + LastItem['subreg'][-1]['bitlength']

+ if Match.group(3) == 'B':

+ SubCfgDict['bitunit'] = 1

+ elif Match.group(3) == 'W':

+ SubCfgDict['bitunit'] = 2

+ elif Match.group(3) == 'Q':

+ SubCfgDict['bitunit'] = 8

+ else:

+ SubCfgDict['bitunit'] = 4

+ SubCfgDict['bitoffset'] = SubOffset

+ SubCfgDict['order'] = self.GetOrderNumber(

+ SubCfgDict['offset'], SubCfgDict['order'],

+ SubOffset)

+ SubCfgDict['value'] = ''

+ SubCfgDict['cname'] = '%s_%s' % (LastItem['cname'],

+ Match.group(1))

+ self.CfgDuplicationCheck(CfgDict,

+ SubCfgDict['cname'])

+ LastItem['subreg'].append(SubCfgDict.copy())

+ Clear = True

+

+ if Clear:

+ ConfigDict['name'] = ''

+ ConfigDict['find'] = ''

+ ConfigDict['struct'] = ''

+ ConfigDict['embed'] = ''

+ ConfigDict['marker'] = ''

+ ConfigDict['comment'] = ''

+ ConfigDict['order'] = -1

+ ConfigDict['subreg'] = []

+ ConfigDict['option'] = ''

+ ConfigDict['condition'] = ''

+

+ return Error

+

+ def GetBsfBitFields(self, subitem, bytes):

+ start = subitem['bitoffset']

+ end = start + subitem['bitlength']

+ bitsvalue = ''.join('{0:08b}'.format(i) for i in bytes[::-1])

+ bitsvalue = bitsvalue[::-1]

+ bitslen = len(bitsvalue)

+ if start > bitslen or end > bitslen:

+ raise Exception("Invalid bits offset [%d,%d] %d for %s" %

+ (start, end, bitslen, subitem['name']))

+ return '0x%X' % (int(bitsvalue[start:end][::-1], 2))

+
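GetBsfBitFields above builds an LSB-first bit string from little-endian bytes and slices out one field. The same extraction, sketched standalone (`get_bit_field` is an invented name; it is equivalent to `(value >> bitoffset) & ((1 << bitlength) - 1)`):

```python
def get_bit_field(bitoffset, bitlength, data):
    # Reverse the bytes, render each as 8 bits, then reverse the whole string
    # so index 0 is the least significant bit of the little-endian value.
    bits = ''.join('{0:08b}'.format(b) for b in data[::-1])[::-1]
    if bitoffset + bitlength > len(bits):
        raise ValueError('bit range out of bounds')
    return int(bits[bitoffset:bitoffset + bitlength][::-1], 2)
```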

+ def UpdateBsfBitFields(self, SubItem, NewValue, ValueArray):

+ Start = SubItem['bitoffset']

+ End = Start + SubItem['bitlength']

+ Blen = len(ValueArray)

+ BitsValue = ''.join('{0:08b}'.format(i) for i in ValueArray[::-1])

+ BitsValue = BitsValue[::-1]

+ BitsLen = len(BitsValue)

+ if Start > BitsLen or End > BitsLen:

+ raise Exception("Invalid bits offset [%d,%d] %d for %s" %

+ (Start, End, BitsLen, SubItem['name']))

+ BitsValue = BitsValue[:Start] + '{0:0{1}b}'.format(

+ NewValue, SubItem['bitlength'])[::-1] + BitsValue[End:]

+ ValueArray[:] = bytearray.fromhex(

+ '{0:0{1}x}'.format(int(BitsValue[::-1], 2), Blen * 2))[::-1]

+
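UpdateBsfBitFields above is the inverse write path, done via bit-string splicing. An integer-mask equivalent is shown below as a sketch; this is not the patch's implementation, just the same effect expressed with shifts:

```python
def set_bit_field(bitoffset, bitlength, new_value, data):
    # Read the little-endian value, clear the target field, OR in the new bits.
    value = int.from_bytes(bytes(data), 'little')
    mask = (1 << bitlength) - 1
    value = (value & ~(mask << bitoffset)) | ((new_value & mask) << bitoffset)
    return bytearray(value.to_bytes(len(data), 'little'))
```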

+ def CreateVarDict(self):

+ Error = 0

+ self._VarDict = {}

+ if len(self._CfgItemList) > 0:

+ Item = self._CfgItemList[-1]

+ self._VarDict['_LENGTH_'] = '%d' % (Item['offset'] +

+ Item['length'])

+ for Item in self._CfgItemList:

+ Embed = Item['embed']

+ Match = re.match("^(\\w+):(\\w+):(START|END)", Embed)

+ if Match:

+ StructName = Match.group(1)

+ VarName = '_%s_%s_' % (Match.group(3), StructName)

+ if Match.group(3) == 'END':

+ self._VarDict[VarName] = Item['offset'] + Item['length']

+ self._VarDict['_LENGTH_%s_' % StructName] = \

+ self._VarDict['_END_%s_' % StructName] - \

+ self._VarDict['_START_%s_' % StructName]

+ if Match.group(2).startswith('TAG_'):

+ if (self.Mode != 'FSP') and (self._VarDict

+ ['_LENGTH_%s_' %

+ StructName] % 4):

+ raise Exception("Size of structure '%s' is %d, \

+not DWORD aligned !" % (StructName, self._VarDict['_LENGTH_%s_' %
+StructName]))

+ self._VarDict['_TAG_%s_' % StructName] = int(

+ Match.group(2)[4:], 16) & 0xFFF

+ else:

+ self._VarDict[VarName] = Item['offset']

+ if Item['marker']:

+ self._VarDict['_OFFSET_%s_' % Item['marker'].strip()] = \

+ Item['offset']

+ return Error

+
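The `embed` strings matched in CreateVarDict above have the shape `<Struct>:<TagOrVar>:<START|END>`, with `TAG_xxx` carrying a 12-bit tag ID. A parsing sketch, assuming that shape (`parse_embed` is an invented helper):

```python
import re

def parse_embed(embed):
    # Returns (struct_name, tag_id or None, is_end), or None if not an embed marker.
    m = re.match(r'^(\w+):(\w+):(START|END)$', embed)
    if not m:
        return None
    tag = m.group(2)
    tag_id = int(tag[4:], 16) & 0xFFF if tag.startswith('TAG_') else None
    return m.group(1), tag_id, m.group(3) == 'END'
```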

+ def UpdateBsfBitUnit(self, Item):

+ BitTotal = 0

+ BitOffset = 0

+ StartIdx = 0

+ Unit = None

+ UnitDec = {1: 'BYTE', 2: 'WORD', 4: 'DWORD', 8: 'QWORD'}

+ for Idx, SubItem in enumerate(Item['subreg']):

+ if Unit is None:

+ Unit = SubItem['bitunit']

+ BitLength = SubItem['bitlength']

+ BitTotal += BitLength

+ BitOffset += BitLength

+

+ if BitOffset > 64 or BitOffset > Unit * 8:

+ break

+

+ if BitOffset == Unit * 8:

+ for SubIdx in range(StartIdx, Idx + 1):

+ Item['subreg'][SubIdx]['bitunit'] = Unit

+ BitOffset = 0

+ StartIdx = Idx + 1

+ Unit = None

+

+ if BitOffset > 0:

+ raise Exception("Bit fields cannot fit into %s for \

+'%s.%s' !" % (UnitDec[Unit], Item['cname'], SubItem['cname']))

+

+ ExpectedTotal = Item['length'] * 8

+ if ExpectedTotal != BitTotal:

+ raise Exception("Bit fields total length (%d) does not match \

+length (%d) of '%s' !" % (BitTotal, ExpectedTotal, Item['cname']))

+

+ def UpdateDefaultValue(self):

+ Error = 0

+ for Idx, Item in enumerate(self._CfgItemList):

+ if len(Item['subreg']) == 0:

+ Value = Item['value']

+ if (len(Value) > 0) and (Value[0] == '{' or Value[0] == "'" or

+ Value[0] == '"'):

+ # {XXX} or 'XXX' strings

+ self.FormatListValue(self._CfgItemList[Idx])

+ else:

+ Match = re.match("(0x[0-9a-fA-F]+|[0-9]+)", Value)

+ if not Match:

+ NumValue = self.EvaluateExpress(Value)

+ Item['value'] = '0x%X' % NumValue

+ else:

+ ValArray = self.ValueToByteArray(Item['value'], Item['length'])

+ for SubItem in Item['subreg']:

+ SubItem['value'] = self.GetBsfBitFields(SubItem, ValArray)

+ self.UpdateBsfBitUnit(Item)

+ return Error

+

+ @staticmethod

+ def ExpandIncludeFiles(FilePath, CurDir=''):

+ if CurDir == '':

+ CurDir = os.path.dirname(FilePath)

+ FilePath = os.path.basename(FilePath)

+

+ InputFilePath = os.path.join(CurDir, FilePath)

+ File = open(InputFilePath, "r")

+ Lines = File.readlines()

+ File.close()

+

+ NewLines = []

+ for LineNum, Line in enumerate(Lines):

+ Match = re.match("^!include\\s*(.+)?$", Line)

+ if Match:

+ IncPath = Match.group(1)

+ TmpPath = os.path.join(CurDir, IncPath)

+ OrgPath = TmpPath

+ if not os.path.exists(TmpPath):

+ CurDir = os.path.join(os.path.dirname(

+ os.path.realpath(__file__)), "..", "..")

+ TmpPath = os.path.join(CurDir, IncPath)

+ if not os.path.exists(TmpPath):

+ raise Exception("ERROR: Cannot open include file '%s'." %

+ OrgPath)

+ else:

+ NewLines.append(('# Included from file: %s\n' %

+ IncPath, TmpPath, 0))

+ NewLines.append(('# %s\n' % ('=' * 80), TmpPath, 0))

+ NewLines.extend(CGenCfgData.ExpandIncludeFiles

+ (IncPath, CurDir))

+ else:

+ NewLines.append((Line, InputFilePath, LineNum))

+

+ return NewLines

+
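ExpandIncludeFiles above recursively splices `!include` targets into the line stream, tagging each line with its origin for error reporting. A filesystem-free sketch of the recursion, using an in-memory dict in place of disk files (names here are invented for illustration):

```python
import re

def expand_includes(name, files):
    # `files` maps file name -> list of lines (an in-memory stand-in for disk).
    out = []
    for num, line in enumerate(files[name]):
        m = re.match(r'^!include\s*(.+?)\s*$', line)
        if m:
            out.extend(expand_includes(m.group(1), files))
        else:
            out.append((line, name, num))
    return out
```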

+ def OverrideDefaultValue(self, DltFile):

+ Error = 0

+ DltLines = CGenCfgData.ExpandIncludeFiles(DltFile)

+

+ PlatformId = None

+ for Line, FilePath, LineNum in DltLines:

+ Line = Line.strip()

+ if not Line or Line.startswith('#'):

+ continue

+ Match = re.match("\\s*(\\w+)\\.(\\w+)(\\.\\w+)?\\s*\\|\\s*(.+)",

+ Line)

+ if not Match:

+ raise Exception("Unrecognized line '%s' (File:'%s' Line:%d) !"

+ % (Line, FilePath, LineNum + 1))

+

+ Found = False

+ InScope = False

+ for Idx, Item in enumerate(self._CfgItemList):

+ if not InScope:

+ if not (Item['embed'].endswith(':START') and

+ Item['embed'].startswith(Match.group(1))):

+ continue

+ InScope = True

+ if Item['cname'] == Match.group(2):

+ Found = True

+ break

+ if Item['embed'].endswith(':END') and \

+ Item['embed'].startswith(Match.group(1)):

+ break

+ Name = '%s.%s' % (Match.group(1), Match.group(2))

+ if not Found:

+ ErrItem = Match.group(2) if InScope else Match.group(1)

+ raise Exception("Invalid configuration '%s' in '%s' \

+(File:'%s' Line:%d) !" % (ErrItem, Name, FilePath, LineNum + 1))

+

+ ValueStr = Match.group(4).strip()

+ if Match.group(3) is not None:

+ # This is a subregion item

+ BitField = Match.group(3)[1:]

+ Found = False

+ if len(Item['subreg']) > 0:

+ for SubItem in Item['subreg']:

+ if SubItem['cname'] == '%s_%s' % \

+ (Item['cname'], BitField):

+ Found = True

+ break

+ if not Found:

+ raise Exception("Invalid configuration bit field \

+'%s' in '%s.%s' (File:'%s' Line:%d) !" % (BitField, Name, BitField,

+ FilePath, LineNum + 1))

+

+ try:

+ Value = int(ValueStr, 16) if ValueStr.startswith('0x') \

+ else int(ValueStr, 10)

+ except Exception:

+ raise Exception("Invalid value '%s' for bit field '%s.%s' \

+(File:'%s' Line:%d) !" % (ValueStr, Name, BitField, FilePath, LineNum + 1))

+

+ if Value >= 2 ** SubItem['bitlength']:

+ raise Exception("Invalid configuration bit field value \

+'%s' for '%s.%s' (File:'%s' Line:%d) !" % (Value, Name, BitField,

+ FilePath, LineNum + 1))

+

+ ValArray = self.ValueToByteArray(Item['value'], Item['length'])

+ self.UpdateBsfBitFields(SubItem, Value, ValArray)

+

+ if Item['value'].startswith('{'):

+ Item['value'] = '{' + ', '.join('0x%02X' % i

+ for i in ValArray) + '}'

+ else:

+ BitsValue = ''.join('{0:08b}'.format(i)

+ for i in ValArray[::-1])

+ Item['value'] = '0x%X' % (int(BitsValue, 2))

+ else:

+ if Item['value'].startswith('{') and \

+ not ValueStr.startswith('{'):

+ raise Exception("Data array required for '%s' \

+(File:'%s' Line:%d) !" % (Name, FilePath, LineNum + 1))

+ Item['value'] = ValueStr

+

+ if Name == 'PLATFORMID_CFG_DATA.PlatformId':

+ PlatformId = ValueStr

+

+ if (PlatformId is None) and (self.Mode != 'FSP'):

+ raise Exception("PLATFORMID_CFG_DATA.PlatformId is missing \

+in file '%s' !" % (DltFile))

+

+ return Error

+
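The delta lines matched in OverrideDefaultValue above have the form `STRUCT.Field | value`, with an optional `.BitField` component. A standalone parsing sketch using the same regular expression (`parse_delta_line` is an invented name):

```python
import re

DLT_LINE = re.compile(r'\s*(\w+)\.(\w+)(\.\w+)?\s*\|\s*(.+)')

def parse_delta_line(line):
    # Returns (struct, field, bitfield or None, value_string).
    m = DLT_LINE.match(line)
    if not m:
        raise ValueError("Unrecognized line '%s'" % line)
    struct, field, bits, value = m.groups()
    return struct, field, bits[1:] if bits else None, value.strip()
```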

+ def ProcessMultilines(self, String, MaxCharLength):

+ Multilines = ''

+ StringLength = len(String)

+ CurrentStringStart = 0

+ StringOffset = 0

+ BreakLineDict = []

+ if len(String) <= MaxCharLength:

+ while (StringOffset < StringLength):

+ if StringOffset >= 1:

+ if String[StringOffset - 1] == '\\' and \

+ String[StringOffset] == 'n':

+ BreakLineDict.append(StringOffset + 1)

+ StringOffset += 1

+ if BreakLineDict != []:

+ for Each in BreakLineDict:

+ Multilines += " %s\n" % String[CurrentStringStart:Each].\

+ lstrip()

+ CurrentStringStart = Each

+ if StringLength - CurrentStringStart > 0:

+ Multilines += " %s\n" % String[CurrentStringStart:].\

+ lstrip()

+ else:

+ Multilines = " %s\n" % String

+ else:

+ NewLineStart = 0

+ NewLineCount = 0

+ FoundSpaceChar = False

+ while(StringOffset < StringLength):

+ if StringOffset >= 1:

+ if NewLineCount >= MaxCharLength - 1:

+ if String[StringOffset] == ' ' and \

+ StringLength - StringOffset > 10:

+ BreakLineDict.append(NewLineStart + NewLineCount)

+ NewLineStart = NewLineStart + NewLineCount

+ NewLineCount = 0

+ FoundSpaceChar = True

+ elif StringOffset == StringLength - 1 \

+ and FoundSpaceChar is False:

+ BreakLineDict.append(0)

+ if String[StringOffset - 1] == '\\' and \

+ String[StringOffset] == 'n':

+ BreakLineDict.append(StringOffset + 1)

+ NewLineStart = StringOffset + 1

+ NewLineCount = 0

+ StringOffset += 1

+ NewLineCount += 1

+ if BreakLineDict != []:

+ BreakLineDict.sort()

+ for Each in BreakLineDict:

+ if Each > 0:

+ Multilines += " %s\n" % String[

+ CurrentStringStart:Each].lstrip()

+ CurrentStringStart = Each

+ if StringLength - CurrentStringStart > 0:

+ Multilines += " %s\n" % String[CurrentStringStart:].\

+ lstrip()

+ return Multilines

+

+ def CreateField(self, Item, Name, Length, Offset, Struct,

+ BsfName, Help, Option, BitsLength=None):

+ PosName = 28

+ NameLine = ''

+ HelpLine = ''

+ OptionLine = ''

+

+ if Length == 0 and Name == 'Dummy':

+ return '\n'

+

+ IsArray = False

+ if Length in [1, 2, 4, 8]:

+ Type = "UINT%d" % (Length * 8)

+ else:

+ IsArray = True

+ Type = "UINT8"

+

+ if Item and Item['value'].startswith('{'):

+ Type = "UINT8"

+ IsArray = True

+

+ if Struct != '':

+ Type = Struct

+ if Struct in ['UINT8', 'UINT16', 'UINT32', 'UINT64']:

+ IsArray = True

+ Unit = int(Type[4:]) // 8

+ Length = Length / Unit

+ else:

+ IsArray = False

+

+ if IsArray:

+ Name = Name + '[%d]' % Length

+

+ if len(Type) < PosName:

+ Space1 = PosName - len(Type)

+ else:

+ Space1 = 1

+

+ if BsfName != '':

+ NameLine = " %s\n" % BsfName

+ else:

+ NameLine = "\n"

+

+ if Help != '':

+ HelpLine = self.ProcessMultilines(Help, 80)

+

+ if Option != '':

+ OptionLine = self.ProcessMultilines(Option, 80)

+

+ if BitsLength is None:

+ BitsLength = ''

+ else:

+ BitsLength = ' : %d' % BitsLength

+

+ return "\n/** %s%s%s**/\n %s%s%s%s;\n" % \

+ (NameLine, HelpLine, OptionLine, Type, ' ' * Space1, Name,

+ BitsLength)

+

+ def SplitTextBody(self, TextBody):

+ Marker1 = '{ /* _COMMON_STRUCT_START_ */'

+ Marker2 = '; /* _COMMON_STRUCT_END_ */'

+ ComBody = []

+ TxtBody = []

+ IsCommon = False

+ for Line in TextBody:

+ if Line.strip().endswith(Marker1):

+ Line = Line.replace(Marker1[1:], '')

+ IsCommon = True

+ if Line.strip().endswith(Marker2):

+ Line = Line.replace(Marker2[1:], '')

+ if IsCommon:

+ ComBody.append(Line)

+ IsCommon = False

+ continue

+ if IsCommon:

+ ComBody.append(Line)

+ else:

+ TxtBody.append(Line)

+ return ComBody, TxtBody

+

+ def GetStructArrayInfo(self, Input):

+ ArrayStr = Input.split('[')

+ Name = ArrayStr[0]

+ if len(ArrayStr) > 1:

+ NumStr = ''.join(c for c in ArrayStr[-1] if c.isdigit())

+ NumStr = '1000' if len(NumStr) == 0 else NumStr

+ ArrayNum = int(NumStr)

+ else:

+ ArrayNum = 0

+ return Name, ArrayNum

+
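GetStructArrayInfo above splits `NAME[N]` declarations, treating an empty bracket as the sentinel 1000 ("auto size"). The same behavior as a standalone sketch:

```python
def get_struct_array_info(text):
    # "NAME[8]" -> ("NAME", 8); "NAME" -> ("NAME", 0); "NAME[]" -> ("NAME", 1000)
    parts = text.split('[')
    name = parts[0]
    if len(parts) == 1:
        return name, 0
    digits = ''.join(c for c in parts[-1] if c.isdigit())
    return name, int(digits) if digits else 1000
```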

+ def PostProcessBody(self, TextBody, IncludeEmbedOnly=True):

+ NewTextBody = []

+ OldTextBody = []

+ IncTextBody = []

+ StructBody = []

+ IncludeLine = False

+ EmbedFound = False

+ StructName = ''

+ ArrayVarName = ''

+ VariableName = ''

+ Count = 0

+ Level = 0

+ IsCommonStruct = False

+

+ for Line in TextBody:

+ if Line.startswith('#define '):

+ IncTextBody.append(Line)

+ continue

+

+ if not Line.startswith('/* EMBED_STRUCT:'):

+ Match = False

+ else:

+ Match = re.match("^/\\*\\sEMBED_STRUCT:([\\w\\[\\]\\*]+):\

+([\\w\\[\\]\\*]+):(\\w+):(START|END)([\\s\\d]+)\\*/([\\s\\S]*)", Line)

+

+ if Match:

+ ArrayMarker = Match.group(5)

+ if Match.group(4) == 'END':

+ Level -= 1

+ if Level == 0:

+ Line = Match.group(6)

+ else: # 'START'

+ Level += 1

+ if Level == 1:

+ Line = Match.group(6)

+ else:

+ EmbedFound = True

+ TagStr = Match.group(3)

+ if TagStr.startswith('TAG_'):

+ try:

+ TagVal = int(TagStr[4:], 16)

+ except Exception:

+ TagVal = -1

+ if (TagVal >= 0) and (TagVal < self._MinCfgTagId):

+ IsCommonStruct = True

+

+ if Level == 1:

+ if IsCommonStruct:

+ Suffix = ' /* _COMMON_STRUCT_START_ */'

+ else:

+ Suffix = ''

+ StructBody = ['typedef struct {%s' % Suffix]

+ StructName = Match.group(1)

+ StructType = Match.group(2)

+ VariableName = Match.group(3)

+ MatchOffset = re.search('/\\*\\*\\sOffset\\s0x\

+([a-fA-F0-9]+)', Line)

+ if MatchOffset:

+ Offset = int(MatchOffset.group(1), 16)

+ else:

+ Offset = None

+ IncludeLine = True

+

+ ModifiedStructType = StructType.rstrip()

+ if ModifiedStructType.endswith(']'):

+ Idx = ModifiedStructType.index('[')

+ if ArrayMarker != ' ':

+ # Auto array size

+ OldTextBody.append('')

+ ArrayVarName = VariableName

+ if int(ArrayMarker) == 1000:

+ Count = 1

+ else:

+ Count = int(ArrayMarker) + 1000

+ else:

+ if Count < 1000:

+ Count += 1

+

+ VariableTemp = ArrayVarName + '[%d]' % (

+ Count if Count < 1000 else Count - 1000)

+ OldTextBody[-1] = self.CreateField(

+ None, VariableTemp, 0, Offset,

+ ModifiedStructType[:Idx], '',

+ 'Structure Array', '')

+ else:

+ ArrayVarName = ''

+ OldTextBody.append(self.CreateField(

+ None, VariableName, 0, Offset,

+ ModifiedStructType, '', '', ''))

+

+ if IncludeLine:

+ StructBody.append(Line)

+ else:

+ OldTextBody.append(Line)

+

+ if Match and Match.group(4) == 'END':

+ if Level == 0:

+ if (StructType != Match.group(2)) or \

+ (VariableName != Match.group(3)):

+ print("Unmatched struct name '%s' and '%s' !" %

+ (StructName, Match.group(2)))

+ else:

+ if IsCommonStruct:

+ Suffix = ' /* _COMMON_STRUCT_END_ */'

+ else:

+ Suffix = ''

+ Line = '} %s;%s\n\n\n' % (StructName, Suffix)

+ StructBody.append(Line)

+ if (Line not in NewTextBody) and \

+ (Line not in OldTextBody):

+ NewTextBody.extend(StructBody)

+ IncludeLine = False

+ IsCommonStruct = False

+

+ if not IncludeEmbedOnly:

+ NewTextBody.extend(OldTextBody)

+

+ if EmbedFound:

+ NewTextBody = self.PostProcessBody(NewTextBody, False)

+

+ NewTextBody = IncTextBody + NewTextBody

+ return NewTextBody

+

+ def WriteHeaderFile(self, TxtBody, FileName, Type='h'):

+ FileNameDef = os.path.basename(FileName).replace('.', '_')

+ FileNameDef = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', FileNameDef)

+ FileNameDef = re.sub('([a-z0-9])([A-Z])', r'\1_\2',

+ FileNameDef).upper()

+

+ Lines = []

+ Lines.append("%s\n" % GetCopyrightHeader(Type))

+ Lines.append("#ifndef __%s__\n" % FileNameDef)

+ Lines.append("#define __%s__\n\n" % FileNameDef)

+ if Type == 'h':

+ Lines.append("#pragma pack(1)\n\n")

+ Lines.extend(TxtBody)

+ if Type == 'h':

+ Lines.append("#pragma pack()\n\n")

+ Lines.append("#endif\n")

+

+ # Don't rewrite if the contents are the same

+ Create = True

+ if os.path.exists(FileName):

+ HdrFile = open(FileName, "r")

+ OrgTxt = HdrFile.read()

+ HdrFile.close()

+

+ NewTxt = ''.join(Lines)

+ if OrgTxt == NewTxt:

+ Create = False

+

+ if Create:

+ HdrFile = open(FileName, "w")

+ HdrFile.write(''.join(Lines))

+ HdrFile.close()

+
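The include-guard derivation at the top of WriteHeaderFile above turns a CamelCase file name into an UPPER_SNAKE token via two regex passes. Extracted as a sketch (`header_guard` is an invented name):

```python
import re

def header_guard(file_name):
    # "ConfigDataStruct.h" -> "CONFIG_DATA_STRUCT_H"
    token = file_name.replace('.', '_')
    token = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', token)
    return re.sub('([a-z0-9])([A-Z])', r'\1_\2', token).upper()
```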

+ def CreateHeaderFile(self, HdrFileName, ComHdrFileName=''):

+ LastStruct = ''

+ SpaceIdx = 0

+ Offset = 0

+ FieldIdx = 0

+ LastFieldIdx = 0

+ ResvOffset = 0

+ ResvIdx = 0

+ TxtBody = []

+ LineBuffer = []

+ CfgTags = []

+ LastVisible = True

+

+ TxtBody.append("typedef struct {\n")

+ for Item in self._CfgItemList:

+ # Search for CFGDATA tags

+ Embed = Item["embed"].upper()

+ if Embed.endswith(':START'):

+ Match = re.match(r'(\w+)_CFG_DATA:TAG_([0-9A-F]+):START',

+ Embed)

+ if Match:

+ TagName = Match.group(1)

+ TagId = int(Match.group(2), 16)

+ CfgTags.append((TagId, TagName))

+

+ # Only process visible items

+ NextVisible = LastVisible

+

+ if LastVisible and (Item['header'] == 'OFF'):

+ NextVisible = False

+ ResvOffset = Item['offset']

+ elif (not LastVisible) and Item['header'] == 'ON':

+ NextVisible = True

+ Name = "ReservedUpdSpace%d" % ResvIdx

+ ResvIdx = ResvIdx + 1

+ TxtBody.append(self.CreateField(

+ Item, Name, Item["offset"] - ResvOffset,

+ ResvOffset, '', '', '', ''))

+ FieldIdx += 1

+

+ if Offset < Item["offset"]:

+ if LastVisible:

+ Name = "UnusedUpdSpace%d" % SpaceIdx

+ LineBuffer.append(self.CreateField

+ (Item, Name, Item["offset"] -

+ Offset, Offset, '', '', '', ''))

+ FieldIdx += 1

+ SpaceIdx = SpaceIdx + 1

+ Offset = Item["offset"]

+

+ LastVisible = NextVisible

+

+ Offset = Offset + Item["length"]

+ if LastVisible:

+ for Each in LineBuffer:

+ TxtBody.append(Each)

+ LineBuffer = []

+ Embed = Item["embed"].upper()

+ if Embed.endswith(':START') or Embed.endswith(':END'):

+ # EMBED_STRUCT: StructName : \

+ # ItemName : VariableName : START|END

+ Name, ArrayNum = self.GetStructArrayInfo(Item["struct"])

+ Remaining = Item["embed"]

+ if (LastFieldIdx + 1 == FieldIdx) and (LastStruct == Name):

+ ArrayMarker = ' '

+ else:

+ ArrayMarker = '%d' % ArrayNum

+ LastFieldIdx = FieldIdx

+ LastStruct = Name

+ Marker = '/* EMBED_STRUCT:%s:%s%s*/ ' % (Name, Remaining,

+ ArrayMarker)

+ # if Embed.endswith(':START') and Comment != '':

+ # Marker = '/* COMMENT:%s */ \n' % Item["comment"] + Marker

+ else:

+ if Embed == '':

+ Marker = ''

+ else:

+ self.Error = "Invalid embedded structure \

+format '%s'!\n" % Item["embed"]

+ return 4

+

+ # Generate bit fields for structure

+ if len(Item['subreg']) > 0 and Item["struct"]:

+ StructType = Item["struct"]

+ StructName, ArrayNum = self.GetStructArrayInfo(StructType)

+ if (LastFieldIdx + 1 == FieldIdx) and \

+ (LastStruct == Item["struct"]):

+ ArrayMarker = ' '

+ else:

+ ArrayMarker = '%d' % ArrayNum

+ TxtBody.append('/* EMBED_STRUCT:%s:%s:%s:START%s*/\n' %

+ (StructName, StructType, Item["cname"],

+ ArrayMarker))

+ for SubItem in Item['subreg']:

+ Name = SubItem["cname"]

+ if Name.startswith(Item["cname"]):

+ Name = Name[len(Item["cname"]) + 1:]

+ Line = self.CreateField(

+ SubItem, Name, SubItem["bitunit"],

+ SubItem["offset"], SubItem['struct'],

+ SubItem['name'], SubItem['help'],

+ SubItem['option'], SubItem['bitlength'])

+ TxtBody.append(Line)

+ TxtBody.append('/* EMBED_STRUCT:%s:%s:%s:END%s*/\n' %

+ (StructName, StructType, Item["cname"],

+ ArrayMarker))

+ LastFieldIdx = FieldIdx

+ LastStruct = Item["struct"]

+ FieldIdx += 1

+ else:

+ FieldIdx += 1

+ Line = Marker + self.CreateField(

+ Item, Item["cname"], Item["length"], Item["offset"],

+ Item['struct'], Item['name'], Item['help'],

+ Item['option'])

+ TxtBody.append(Line)

+

+ TxtBody.append("}\n\n")

+

+ # Handle the embedded data structure

+ TxtBody = self.PostProcessBody(TxtBody)

+ ComBody, TxtBody = self.SplitTextBody(TxtBody)

+

+ # Prepare TAG defines

+ PltTagDefTxt = ['\n']

+ ComTagDefTxt = ['\n']

+ for TagId, TagName in sorted(CfgTags):

+ TagLine = '#define %-30s 0x%03X\n' % ('CDATA_%s_TAG' %

+ TagName, TagId)

+ if TagId < self._MinCfgTagId:

+ # TAG ID < 0x100, it is a generic TAG

+ ComTagDefTxt.append(TagLine)

+ else:

+ PltTagDefTxt.append(TagLine)

+ PltTagDefTxt.append('\n\n')

+ ComTagDefTxt.append('\n\n')

+

+ # Write file back

+ self.WriteHeaderFile(PltTagDefTxt + TxtBody, HdrFileName)

+ if ComHdrFileName:

+ self.WriteHeaderFile(ComTagDefTxt + ComBody, ComHdrFileName)

+

+ return 0

+

+ def UpdateConfigItemValue(self, Item, ValueStr):

+ IsArray = True if Item['value'].startswith('{') else False

+ IsString = True if Item['value'].startswith("'") else False

+ Bytes = self.ValueToByteArray(ValueStr, Item['length'])

+ if IsString:

+ NewValue = "'%s'" % Bytes.decode("utf-8")

+ elif IsArray:

+ NewValue = Bytes2Str(Bytes)

+ else:

+ Fmt = '0x%X' if Item['value'].startswith('0x') else '%d'

+ NewValue = Fmt % Bytes2Val(Bytes)

+ Item['value'] = NewValue

+

+ def LoadDefaultFromBinaryArray(self, BinDat, IgnoreFind=False):

+ FindOff = 0

+ StartOff = 0

+ for Item in self._CfgItemList:

+ if Item['length'] == 0:

+ continue

+ if not IgnoreFind and Item['find']:

+ FindBin = Item['find'].encode()

+ Offset = BinDat.find(FindBin)

+ if Offset >= 0:

+ TestOff = BinDat[Offset+len(FindBin):].find(FindBin)

+ if TestOff >= 0:

+ raise Exception('Multiple match found for "%s" !' %

+ Item['find'])

+ FindOff = Offset + len(FindBin)

+ StartOff = Item['offset']

+ else:

+ raise Exception('Could not find "%s" !' % Item['find'])

+ if Item['offset'] + Item['length'] > len(BinDat):

+ raise Exception('Mismatching format between DSC \

+and BIN files !')

+ Offset = FindOff + (Item['offset'] - StartOff)

+ ValStr = Bytes2Str(BinDat[Offset: Offset + Item['length']])

+ self.UpdateConfigItemValue(Item, ValStr)

+

+ self.UpdateDefaultValue()

+

+ def PatchBinaryArray(self, BinDat):

+ FileOff = 0

+ Offset = 0

+ FindOff = 0

+

+ PatchList = []

+ CfgBin = bytearray()

+ for Item in self._CfgItemList:

+ if Item['length'] == 0:

+ continue

+

+ if Item['find']:

+ if len(CfgBin) > 0:

+ PatchList.append((FileOff, CfgBin))

+ FindBin = Item['find'].encode()

+ FileOff = BinDat.find(FindBin)

+ if FileOff < 0:

+ raise Exception('Could not find "%s" !' % Item['find'])

+ else:

+ TestOff = BinDat[FileOff+len(FindBin):].find(FindBin)

+ if TestOff >= 0:

+ raise Exception('Multiple match found for "%s" !' %

+ Item['find'])

+ FileOff += len(FindBin)

+ Offset = Item['offset']

+ FindOff = Offset

+ CfgBin = bytearray()

+

+ if Item['offset'] > Offset:

+ Gap = Item['offset'] - Offset

+ CfgBin.extend(b'\x00' * Gap)

+

+ if Item['type'] == 'Reserved' and Item['option'] == '$SKIP':

+ # keep old data

+ NewOff = FileOff + (Offset - FindOff)

+ FileData = bytearray(BinDat[NewOff: NewOff + Item['length']])

+ CfgBin.extend(FileData)

+ else:

+ CfgBin.extend(self.ValueToByteArray(Item['value'],

+ Item['length']))

+ Offset = Item['offset'] + Item['length']

+

+ if len(CfgBin) > 0:

+ PatchList.append((FileOff, CfgBin))

+

+ for FileOff, CfgBin in PatchList:

+ Length = len(CfgBin)

+ if FileOff + Length < len(BinDat):

+ BinDat[FileOff:FileOff+Length] = CfgBin[:]

+

+ return BinDat

+

+ def GenerateBinaryArray(self):

+ Offset = 0

+ BinDat = bytearray()

+ for Item in self._CfgItemList:

+ if Item['offset'] > Offset:

+ Gap = Item['offset'] - Offset

+ BinDat.extend(b'\x00' * Gap)

+ BinDat.extend(self.ValueToByteArray(Item['value'], Item['length']))

+ Offset = Item['offset'] + Item['length']

+ return BinDat

+

+ def GenerateBinary(self, BinFileName):

+ BinFile = open(BinFileName, "wb")

+ BinFile.write(self.GenerateBinaryArray())

+ BinFile.close()

+ return 0

+

+ def GenerateDataIncFile(self, DatIncFileName, BinFile=None):

+ # Put a prefix GUID before CFGDATA so that it can be located later on

+ Prefix = b'\xa7\xbd\x7f\x73\x20\x1e\x46\xd6' \

+ b'\xbe\x8f\x64\x12\x05\x8d\x0a\xa8'

+ if BinFile:

+ Fin = open(BinFile, 'rb')

+ BinDat = Prefix + bytearray(Fin.read())

+ Fin.close()

+ else:

+ BinDat = Prefix + self.GenerateBinaryArray()

+

+ FileName = os.path.basename(DatIncFileName).upper()

+ FileName = FileName.replace('.', '_')

+

+ TxtLines = []

+

+ TxtLines.append("UINT8 mConfigDataBlob[%d] = {\n" % len(BinDat))

+ Count = 0

+ Line = [' ']

+ for Each in BinDat:

+ Line.append('0x%02X, ' % Each)

+ Count = Count + 1

+ if (Count & 0x0F) == 0:

+ Line.append('\n')

+ TxtLines.append(''.join(Line))

+ Line = [' ']

+ if len(Line) > 1:

+ TxtLines.append(''.join(Line) + '\n')

+

+ TxtLines.append("};\n\n")

+

+ self.WriteHeaderFile(TxtLines, DatIncFileName, 'inc')

+

+ return 0

+

+ def CheckCfgData(self):

+ # Check if CfgData contains any duplicated name

+ def AddItem(Item, ChkList):

+ Name = Item['cname']

+ if Name in ChkList:

+ return Item

+ if Name not in ['Dummy', 'Reserved', 'CfgHeader', 'CondValue']:

+ ChkList.append(Name)

+ return None

+

+ Duplicate = None

+ ChkList = []

+ for Item in self._CfgItemList:

+ Duplicate = AddItem(Item, ChkList)

+ if not Duplicate:

+ for SubItem in Item['subreg']:

+ Duplicate = AddItem(SubItem, ChkList)

+ if Duplicate:

+ break

+ if Duplicate:

+ break

+ if Duplicate:

+ self.Error = "Duplicated CFGDATA '%s' found !\n" % \

+ Duplicate['cname']

+ return -1

+ return 0

+

+ def PrintData(self):

+ for Item in self._CfgItemList:

+ if not Item['length']:

+ continue

+ print("%-10s @Offset:0x%04X Len:%3d Val:%s" %

+ (Item['cname'], Item['offset'], Item['length'],

+ Item['value']))

+ for SubItem in Item['subreg']:

+ print(" %-20s BitOff:0x%04X BitLen:%-3d Val:%s" %

+ (SubItem['cname'], SubItem['bitoffset'],

+ SubItem['bitlength'], SubItem['value']))

+

+ def FormatArrayValue(self, Input, Length):

+ Dat = self.ValueToByteArray(Input, Length)

+ return ','.join('0x%02X' % Each for Each in Dat)

+

+ def GetItemOptionList(self, Item):

+ TmpList = []

+ if Item['type'] == "Combo":

+ if not Item['option'] in self._BuidinOption:

+ OptList = Item['option'].split(',')

+ for Option in OptList:

+ Option = Option.strip()

+ try:

+ (OpVal, OpStr) = Option.split(':')

+ except Exception:

+ raise Exception("Invalid option format '%s' !" %

+ Option)

+ TmpList.append((OpVal, OpStr))

+ return TmpList

+

+ def WriteBsfStruct(self, BsfFd, Item):

+ if Item['type'] == "None":

+ Space = "gPlatformFspPkgTokenSpaceGuid"

+ else:

+ Space = Item['space']

+ Line = " $%s_%s" % (Space, Item['cname'])

+ Match = re.match("\\s*(\\{.+\\})\\s*", Item['value'])

+ if Match:

+ DefaultValue = self.FormatArrayValue(Match.group(1).strip(),

+ Item['length'])

+ else:

+ DefaultValue = Item['value'].strip()

+ if 'bitlength' in Item:

+ if Item['bitlength']:

+ BsfFd.write(" %s%s%4d bits $_DEFAULT_ = %s\n" %

+ (Line, ' ' * (64 - len(Line)), Item['bitlength'],

+ DefaultValue))

+ else:

+ if Item['length']:

+ BsfFd.write(" %s%s%4d bytes $_DEFAULT_ = %s\n" %

+ (Line, ' ' * (64 - len(Line)), Item['length'],

+ DefaultValue))

+

+ return self.GetItemOptionList(Item)

+

+ def GetBsfOption(self, OptionName):

+ if OptionName in self._CfgOptsDict:

+ return self._CfgOptsDict[OptionName]

+ else:

+ return OptionName

+

+ def WriteBsfOption(self, BsfFd, Item):

+ PcdName = Item['space'] + '_' + Item['cname']

+ WriteHelp = 0

+ BsfLines = []

+ if Item['type'] == "Combo":

+ if Item['option'] in self._BuidinOption:

+ Options = self._BuidinOption[Item['option']]

+ else:

+ Options = self.GetBsfOption(PcdName)

+ BsfLines.append(' %s $%s, "%s", &%s,\n' % (

+ Item['type'], PcdName, Item['name'], Options))

+ WriteHelp = 1

+ elif Item['type'].startswith("EditNum"):

+ Match = re.match("EditNum\\s*,\\s*(HEX|DEC)\\s*,\\s*\\(\

+(\\d+|0x[0-9A-Fa-f]+)\\s*,\\s*(\\d+|0x[0-9A-Fa-f]+)\\)", Item['type'])

+ if Match:

+ BsfLines.append(' EditNum $%s, "%s", %s,\n' % (

+ PcdName, Item['name'], Match.group(1)))

+ WriteHelp = 2

+ elif Item['type'].startswith("EditText"):

+ BsfLines.append(' %s $%s, "%s",\n' % (Item['type'], PcdName,

+ Item['name']))

+ WriteHelp = 1

+ elif Item['type'] == "Table":

+ Columns = Item['option'].split(',')

+ if len(Columns) != 0:

+ BsfLines.append(' %s $%s "%s",' % (Item['type'], PcdName,

+ Item['name']))

+ for Col in Columns:

+ Fmt = Col.split(':')

+ if len(Fmt) != 3:

+ raise Exception("Column format '%s' is invalid !" %

+ Fmt)

+ try:

+ Dtype = int(Fmt[1].strip())

+ except Exception:

+ raise Exception("Column size '%s' is invalid !" %

+ Fmt[1])

+ BsfLines.append('\n Column "%s", %d bytes, %s' %

+ (Fmt[0].strip(), Dtype, Fmt[2].strip()))

+ BsfLines.append(',\n')

+ WriteHelp = 1

+

+ if WriteHelp > 0:

+ HelpLines = Item['help'].split('\\n\\r')

+ FirstLine = True

+ for HelpLine in HelpLines:

+ if FirstLine:

+ FirstLine = False

+ BsfLines.append(' Help "%s"\n' % (HelpLine))

+ else:

+ BsfLines.append(' "%s"\n' % (HelpLine))

+ if WriteHelp == 2:

+ BsfLines.append(' "Valid range: %s ~ %s"\n' %

+ (Match.group(2), Match.group(3)))

+

+ if len(Item['condition']) > 4:

+ CondList = Item['condition'].split(',')

+ Idx = 0

+ for Cond in CondList:

+ Cond = Cond.strip()

+ if Cond.startswith('#'):

+ BsfLines.insert(Idx, Cond + '\n')

+ Idx += 1

+ elif Cond.startswith('@#'):

+ BsfLines.append(Cond[1:] + '\n')

+

+ for Line in BsfLines:

+ BsfFd.write(Line)

+

+ def WriteBsfPages(self, PageTree, BsfFd):

+ BsfFd.write('\n')

+ Key = next(iter(PageTree))

+ for Page in PageTree[Key]:

+ PageName = next(iter(Page))

+ BsfFd.write('Page "%s"\n' % self._CfgPageDict[PageName])

+ if len(PageTree[Key]):

+ self.WriteBsfPages(Page, BsfFd)

+

+ BsfItems = []

+ for Item in self._CfgItemList:

+ if Item['name'] != '':

+ if Item['page'] != PageName:

+ continue

+ if len(Item['subreg']) > 0:

+ for SubItem in Item['subreg']:

+ if SubItem['name'] != '':

+ BsfItems.append(SubItem)

+ else:

+ BsfItems.append(Item)

+

+ BsfItems.sort(key=lambda x: x['order'])

+

+ for Item in BsfItems:

+ self.WriteBsfOption(BsfFd, Item)

+ BsfFd.write("EndPage\n\n")

+

+ def GenerateBsfFile(self, BsfFile):

+

+ if BsfFile == '':

+ self.Error = "BSF output file '%s' is invalid" % BsfFile

+ return 1

+

+ Error = 0

+ OptionDict = {}

+ BsfFd = open(BsfFile, "w")

+ BsfFd.write("%s\n" % GetCopyrightHeader('bsf'))

+ BsfFd.write("%s\n" % self._GlobalDataDef)

+ BsfFd.write("StructDef\n")

+ NextOffset = -1

+ for Item in self._CfgItemList:

+ if Item['find'] != '':

+ BsfFd.write('\n Find "%s"\n' % Item['find'])

+ NextOffset = Item['offset'] + Item['length']

+ if Item['name'] != '':

+ if NextOffset != Item['offset']:

+ BsfFd.write(" Skip %d bytes\n" %

+ (Item['offset'] - NextOffset))

+ if len(Item['subreg']) > 0:

+ NextOffset = Item['offset']

+ BitsOffset = NextOffset * 8

+ for SubItem in Item['subreg']:

+ BitsOffset += SubItem['bitlength']

+ if SubItem['name'] == '':

+ if 'bitlength' in SubItem:

+ BsfFd.write(" Skip %d bits\n" %

+ (SubItem['bitlength']))

+ else:

+ BsfFd.write(" Skip %d bytes\n" %

+ (SubItem['length']))

+ else:

+ Options = self.WriteBsfStruct(BsfFd, SubItem)

+ if len(Options) > 0:

+ OptionDict[SubItem

+ ['space']+'_'+SubItem

+ ['cname']] = Options

+

+ NextBitsOffset = (Item['offset'] + Item['length']) * 8

+ if NextBitsOffset > BitsOffset:

+ BitsGap = NextBitsOffset - BitsOffset

+ BitsRemain = BitsGap % 8

+ if BitsRemain:

+ BsfFd.write(" Skip %d bits\n" % BitsRemain)

+ BitsGap -= BitsRemain

+ BytesRemain = BitsGap // 8

+ if BytesRemain:

+ BsfFd.write(" Skip %d bytes\n" %

+ BytesRemain)

+ NextOffset = Item['offset'] + Item['length']

+ else:

+ NextOffset = Item['offset'] + Item['length']

+ Options = self.WriteBsfStruct(BsfFd, Item)

+ if len(Options) > 0:

+ OptionDict[Item['space']+'_'+Item['cname']] = Options

+ BsfFd.write("\nEndStruct\n\n")

+

+ BsfFd.write("%s" % self._BuidinOptionTxt)

+

+ NameList = []

+ OptionList = []

+ for Each in sorted(OptionDict):

+ if OptionDict[Each] not in OptionList:

+ NameList.append(Each)

+ OptionList.append(OptionDict[Each])

+ BsfFd.write("List &%s\n" % Each)

+ for Item in OptionDict[Each]:

+ BsfFd.write(' Selection %s , "%s"\n' %

+ (self.EvaluateExpress(Item[0]), Item[1]))

+ BsfFd.write("EndList\n\n")

+ else:

+ # Item has identical options as another item

+ # Try to reuse the previous options instead

+ Idx = OptionList.index(OptionDict[Each])

+ self._CfgOptsDict[Each] = NameList[Idx]

+

+ BsfFd.write("BeginInfoBlock\n")

+ BsfFd.write(' PPVer "%s"\n' % (self._CfgBlkDict['ver']))

+ BsfFd.write(' Description "%s"\n' % (self._CfgBlkDict['name']))

+ BsfFd.write("EndInfoBlock\n\n")

+

+ self.WriteBsfPages(self._CfgPageTree, BsfFd)

+

+ BsfFd.close()

+ return Error

+

+ def WriteDeltaLine(self, OutLines, Name, ValStr, IsArray):

+ if IsArray:

+ Output = '%s | { %s }' % (Name, ValStr)

+ else:

+ Output = '%s | 0x%X' % (Name, Array2Val(ValStr))

+ OutLines.append(Output)

+

+ def WriteDeltaFile(self, OutFile, PlatformId, OutLines):

+ DltFd = open(OutFile, "w")

+ DltFd.write("%s\n" % GetCopyrightHeader('dlt', True))

+ if PlatformId is not None:

+ DltFd.write('#\n')

+ DltFd.write('# Delta configuration values \

+for platform ID 0x%04X\n' % PlatformId)

+ DltFd.write('#\n\n')

+ for Line in OutLines:

+ DltFd.write('%s\n' % Line)

+ DltFd.close()

+

+ def GenerateDeltaFile(self, OutFile, AbsfFile):

+ # Parse ABSF Build in dict

+ if not os.path.exists(AbsfFile):

+ Lines = []

+ else:

+ with open(AbsfFile) as Fin:

+ Lines = Fin.readlines()

+

+ AbsfBuiltValDict = {}

+ Process = False

+ for Line in Lines:

+ Line = Line.strip()

+ if Line.startswith('StructDef'):

+ Process = True

+ if Line.startswith('EndStruct'):

+ break

+ if not Process:

+ continue

+ Match = re.match('\\s*\\$gCfgData_(\\w+)\\s+\

+(\\d+)\\s+(bits|bytes)\\s+\\$_AS_BUILT_\\s+=\\s+(.+)\\$', Line)

+ if Match:

+ if Match.group(1) not in AbsfBuiltValDict:

+ AbsfBuiltValDict[Match.group(1)] = Match.group(4).strip()

+ else:

+ raise Exception("Duplicated configuration \

+name '%s' found !" % Match.group(1))

+

+ # Match config item in DSC

+ PlatformId = None

+ OutLines = []

+ TagName = ''

+ Level = 0

+ for Item in self._CfgItemList:

+ Name = None

+ if Level == 0 and Item['embed'].endswith(':START'):

+ TagName = Item['embed'].split(':')[0]

+ Level += 1

+ if Item['cname'] in AbsfBuiltValDict:

+ ValStr = AbsfBuiltValDict[Item['cname']]

+ Name = '%s.%s' % (TagName, Item['cname'])

+ if not Item['subreg'] and Item['value'].startswith('{'):

+ Value = Array2Val(Item['value'])

+ IsArray = True

+ else:

+ Value = int(Item['value'], 16)

+ IsArray = False

+ AbsfVal = Array2Val(ValStr)

+ if AbsfVal != Value:

+ if 'PLATFORMID_CFG_DATA.PlatformId' == Name:

+ PlatformId = AbsfVal

+ self.WriteDeltaLine(OutLines, Name, ValStr, IsArray)

+ else:

+ if 'PLATFORMID_CFG_DATA.PlatformId' == Name:

+ raise Exception("'PlatformId' has the \

+same value as DSC default !")

+

+ if Item['subreg']:

+ for SubItem in Item['subreg']:

+ if SubItem['cname'] in AbsfBuiltValDict:

+ ValStr = AbsfBuiltValDict[SubItem['cname']]

+ if Array2Val(ValStr) == int(SubItem['value'], 16):

+ continue

+ Name = '%s.%s.%s' % (TagName, Item['cname'],

+ SubItem['cname'])

+ self.WriteDeltaLine(OutLines, Name, ValStr, False)

+

+ if Item['embed'].endswith(':END'):

+ Level -= 1

+

+ if PlatformId is None:

+ if Lines:

+ raise Exception("'PlatformId' configuration \

+is missing in ABSF file!")

+ else:

+ PlatformId = 0

+

+ self.WriteDeltaFile(OutFile, PlatformId, OutLines)

+

+ return 0

+

+ def GenerateDscFile(self, OutFile):

+ DscFd = open(OutFile, "w")

+ for Line in self._DscLines:

+ DscFd.write(Line + '\n')

+ DscFd.close()

+ return 0

+

+

+def Usage():

+ print('\n'.join([

+ "GenCfgData Version 0.01",

+ "Usage:",

+ " GenCfgData GENINC BinFile \

+IncOutFile [-D Macros]",

+ " GenCfgData GENPKL DscFile \

+PklOutFile [-D Macros]",

+ " GenCfgData GENINC DscFile[;DltFile] \

+IncOutFile [-D Macros]",

+ " GenCfgData GENBIN DscFile[;DltFile] \

+BinOutFile [-D Macros]",

+ " GenCfgData GENBSF DscFile[;DltFile] \

+BsfOutFile [-D Macros]",

+ " GenCfgData GENDLT DscFile[;AbsfFile] \

+DltOutFile [-D Macros]",

+ " GenCfgData GENDSC DscFile \

+DscOutFile [-D Macros]",

+ " GenCfgData GENHDR DscFile[;DltFile] \

+HdrOutFile[;ComHdrOutFile] [-D Macros]"

+ ]))

+

+

+def Main():

+ #

+ # Parse the options and args

+ #

+ argc = len(sys.argv)

+ if argc < 4:

+ Usage()

+ return 1

+

+ GenCfgData = CGenCfgData()

+ Command = sys.argv[1].upper()

+ OutFile = sys.argv[3]

+

+ if argc > 5 and GenCfgData.ParseMacros(sys.argv[4:]) != 0:

+ raise Exception("ERROR: Macro parsing failed !")

+

+ FileList = sys.argv[2].split(';')

+ if len(FileList) == 2:

+ DscFile = FileList[0]

+ DltFile = FileList[1]

+ elif len(FileList) == 1:

+ DscFile = FileList[0]

+ DltFile = ''

+ else:

+ raise Exception("ERROR: Invalid parameter '%s' !" % sys.argv[2])

+

+ if Command == "GENDLT" and DscFile.endswith('.dlt'):

+ # It needs to expand an existing DLT file

+ DltFile = DscFile

+ Lines = CGenCfgData.ExpandIncludeFiles(DltFile)

+ OutTxt = ''.join([x[0] for x in Lines])

+ OutFile = open(OutFile, "w")

+ OutFile.write(OutTxt)

+ OutFile.close()

+ return 0

+

+ if not os.path.exists(DscFile):

+ raise Exception("ERROR: Cannot open file '%s' !" % DscFile)

+

+ CfgBinFile = ''

+ if DltFile:

+ if not os.path.exists(DltFile):

+ raise Exception("ERROR: Cannot open file '%s' !" % DltFile)

+ if Command == "GENDLT":

+ CfgBinFile = DltFile

+ DltFile = ''

+

+ BinFile = ''

+ if (DscFile.lower().endswith('.bin')) and (Command == "GENINC"):

+ # It is binary file

+ BinFile = DscFile

+ DscFile = ''

+

+ if BinFile:

+ if GenCfgData.GenerateDataIncFile(OutFile, BinFile) != 0:

+ raise Exception(GenCfgData.Error)

+ return 0

+

+ if DscFile.lower().endswith('.pkl'):

+ with open(DscFile, "rb") as PklFile:

+ GenCfgData.__dict__ = marshal.load(PklFile)

+ else:

+ if GenCfgData.ParseDscFile(DscFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ # if GenCfgData.CheckCfgData() != 0:

+ # raise Exception(GenCfgData.Error)

+

+ if GenCfgData.CreateVarDict() != 0:

+ raise Exception(GenCfgData.Error)

+

+ if Command == 'GENPKL':

+ with open(OutFile, "wb") as PklFile:

+ marshal.dump(GenCfgData.__dict__, PklFile)

+ return 0

+

+ if DltFile and Command in ['GENHDR', 'GENBIN', 'GENINC', 'GENBSF']:

+ if GenCfgData.OverrideDefaultValue(DltFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ if GenCfgData.UpdateDefaultValue() != 0:

+ raise Exception(GenCfgData.Error)

+

+ # GenCfgData.PrintData ()

+

+ if sys.argv[1] == "GENBIN":

+ if GenCfgData.GenerateBinary(OutFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ elif sys.argv[1] == "GENHDR":

+ OutFiles = OutFile.split(';')

+ BrdOutFile = OutFiles[0].strip()

+ if len(OutFiles) > 1:

+ ComOutFile = OutFiles[1].strip()

+ else:

+ ComOutFile = ''

+ if GenCfgData.CreateHeaderFile(BrdOutFile, ComOutFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ elif sys.argv[1] == "GENBSF":

+ if GenCfgData.GenerateBsfFile(OutFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ elif sys.argv[1] == "GENINC":

+ if GenCfgData.GenerateDataIncFile(OutFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ elif sys.argv[1] == "GENDLT":

+ if GenCfgData.GenerateDeltaFile(OutFile, CfgBinFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ elif sys.argv[1] == "GENDSC":

+ if GenCfgData.GenerateDscFile(OutFile) != 0:

+ raise Exception(GenCfgData.Error)

+

+ else:

+ raise Exception("Unsupported command '%s' !" % Command)

+

+ return 0

+

+

+if __name__ == '__main__':

+ sys.exit(Main())

--
2.28.0.windows.1
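The core of the PatchBinaryArray() flow in the patch above — locate a unique "find" marker in the image, then overwrite the config bytes at the offset that follows it — can be sketched on its own. This is an illustration only, not code from the patch; the function name and sample marker are invented for the example.

```python
# Illustrative sketch of the marker-based patching used by PatchBinaryArray():
# find the marker, reject ambiguous (multiple) matches, then overwrite the
# bytes that follow it. Names and the sample marker are hypothetical.
def patch_after_marker(image, marker, offset, new_bytes):
    pos = image.find(marker)
    if pos < 0:
        raise Exception('Could not find "%s" !' % marker.decode())
    # A second occurrence makes the patch location ambiguous
    if image[pos + len(marker):].find(marker) >= 0:
        raise Exception('Multiple match found for "%s" !' % marker.decode())
    start = pos + len(marker) + offset
    image[start:start + len(new_bytes)] = new_bytes
    return image


img = bytearray(b'head$CFGDATA$\x00\x00\x00\x00tail')
patch_after_marker(img, b'$CFGDATA$', 0, b'\x01\x02')
```

The duplicate-match check mirrors the `'Multiple match found'` exception raised by both LoadDefaultFromBinaryArray() and PatchBinaryArray() in the patch.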


[PATCH v8] IntelFsp2Pkg: Add Config Editor tool support

 

ConfigEditor is a GUI that lets users change configuration
settings directly from the interface, without having to modify
the source.

This tool depends on Python GUI tool kit Tkinter.
It runs on both Windows and Linux.

The user loads the YAML file, along with the DLT file for a
specific board, into ConfigEditor, changes the desired
configuration values, and then generates a new configuration
delta file or a config binary blob for the changed values to
take effect. These outputs are the inputs to the merge tool or
the stitch tool, so that the new config changes can be merged
and stitched into the final configuration blob.

The tool also supports updating a binary directly and displaying
FSP information, and it is backward compatible with the BSF file
format.

Running Configuration Editor:
python ConfigEditor.py
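The delta (DLT) files this flow produces hold one override per line, of the form `TAG.Item | value`, where the value is either a scalar (`0x...`) or a byte array in braces, as emitted by `WriteDeltaLine` in FspGenCfgData.py. A minimal sketch of parsing such a line (the parser itself is illustrative, not part of the patch):

```python
# Illustrative parser for delta ("DLT") lines of the form "NAME.Item | value",
# matching the output format of WriteDeltaLine(): either "name | 0x1234" or
# "name | { 0x01, 0x02 }". Not part of the patch.
def parse_dlt_line(line):
    name, _, val = line.partition('|')
    name = name.strip()
    val = val.strip()
    if val.startswith('{'):
        # Byte-array value, e.g. "{ 0x01, 0x02 }"
        body = val.strip('{}').strip()
        return name, [int(x, 0) for x in body.split(',') if x.strip()]
    # Scalar value; int(..., 0) accepts both hex and decimal
    return name, int(val, 0)
```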

Co-authored-by: Maurice Ma <maurice.ma@intel.com>
Cc: Maurice Ma <maurice.ma@intel.com>
Cc: Nate DeSimone <nathaniel.l.desimone@intel.com>
Cc: Star Zeng <star.zeng@intel.com>
Cc: Chasel Chiu <chasel.chiu@intel.com>
Signed-off-by: Loo Tung Lun <tung.lun.loo@intel.com>
---
IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py |  504 ++++++++++
IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py  | 1499 ++++++++++
IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py    | 2252 ++++++++++
IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py    |  324 ++++++++++
IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py             |  376 +++++-----
IntelFsp2Pkg/Tools/FspGenCfgData.py              | 2637 ++++++++++
6 files changed, 7295 insertions(+), 297 deletions(-)

diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py b/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py
new file mode 100644
index 0000000000..1229279116
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py
@@ -0,0 +1,504 @@
+#!/usr/bin/env python
+# @ CommonUtility.py
+# Common utility script
+#
+# Copyright (c) 2016 - 2021, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+##
+
+import os
+import sys
+import shutil
+import subprocess
+import string
+from ctypes import ARRAY, c_char, c_uint16, c_uint32, \
+     c_uint8, Structure, sizeof
+from importlib.machinery import SourceFileLoader
+from SingleSign import single_sign_gen_pub_key
+
+
+# Key types defined should match with cryptolib.h
+PUB_KEY_TYPE = {
+    "RSA": 1,
+    "ECC": 2,
+    "DSA": 3,
+    }
+
+# Signing type schemes defined should match with cryptolib.h
+SIGN_TYPE_SCHEME = {
+    "RSA_PKCS1": 1,
+    "RSA_PSS": 2,
+    "ECC": 3,
+    "DSA": 4,
+    }
+
+# Hash values defined should match with cryptolib.h
+HASH_TYPE_VALUE = {
+    "SHA2_256": 1,
+    "SHA2_384": 2,
+    "SHA2_512": 3,
+    "SM3_256": 4,
+    }
+
+# Hash values defined should match with cryptolib.h
+HASH_VAL_STRING = dict(map(reversed, HASH_TYPE_VALUE.items()))
+
+AUTH_TYPE_HASH_VALUE = {
+    "SHA2_256": 1,
+    "SHA2_384": 2,
+    "SHA2_512": 3,
+    "SM3_256": 4,
+    "RSA2048SHA256": 1,
+    "RSA3072SHA384": 2,
+    }
+
+HASH_DIGEST_SIZE = {
+    "SHA2_256": 32,
+    "SHA2_384": 48,
+    "SHA2_512": 64,
+    "SM3_256": 32,
+    }
+
+
+class PUB_KEY_HDR (Structure):
+    _pack_ = 1
+    _fields_ = [
+        ('Identifier', ARRAY(c_char, 4)),  # signature ('P', 'U', 'B', 'K')
+        ('KeySize', c_uint16),             # Length of Public Key
+        ('KeyType', c_uint8),              # RSA or ECC
+        ('Reserved', ARRAY(c_uint8, 1)),
+        ('KeyData', ARRAY(c_uint8, 0)),
+        ]
+
+    def __init__(self):
+        self.Identifier = b'PUBK'
+
+
+class SIGNATURE_HDR (Structure):
+    _pack_ = 1
+    _fields_ = [
+        ('Identifier', ARRAY(c_char, 4)),
+        ('SigSize', c_uint16),
+        ('SigType', c_uint8),
+        ('HashAlg', c_uint8),
+        ('Signature', ARRAY(c_uint8, 0)),
+        ]
+
+    def __init__(self):
+        self.Identifier = b'SIGN'
+
+
+class LZ_HEADER(Structure):
+    _pack_ = 1
+    _fields_ = [
+        ('signature', ARRAY(c_char, 4)),
+        ('compressed_len', c_uint32),
+        ('length', c_uint32),
+        ('version', c_uint16),
+        ('svn', c_uint8),
+        ('attribute', c_uint8)
+        ]
+    _compress_alg = {
+        b'LZDM': 'Dummy',
+        b'LZ4 ': 'Lz4',
+        b'LZMA': 'Lzma',
+        }
+
+
+def print_bytes(data, indent=0, offset=0, show_ascii=False):
+    bytes_per_line = 16
+    printable = ' ' + string.ascii_letters + string.digits + \
+        string.punctuation
+    str_fmt = '{:s}{:04x}: {:%ds} {:s}' % (bytes_per_line * 3)
+    data_array = bytearray(data)
+    for idx in range(0, len(data_array), bytes_per_line):
+        hex_str = ' '.join(
+            '%02X' % val for val in data_array[idx:idx + bytes_per_line])
+        asc_str = ''.join('%c' % (val if (chr(val) in printable) else '.')
+                          for val in data_array[idx:idx + bytes_per_line])
+        print(str_fmt.format(
+            indent * ' ',
+            offset + idx, hex_str,
+            ' ' + asc_str if show_ascii else ''))
+
+
+def get_bits_from_bytes(bytes, start, length):
+    if length == 0:
+        return 0
+    byte_start = (start) // 8
+    byte_end = (start + length - 1) // 8
+    bit_start = start & 7
+    mask = (1 << length) - 1
+    val = bytes_to_value(bytes[byte_start:byte_end + 1])
+    val = (val >> bit_start) & mask
+    return val
+
+
+def set_bits_to_bytes(bytes, start, length, bvalue):
+    if length == 0:
+        return
+    byte_start = (start) // 8
+    byte_end = (start + length - 1) // 8
+    bit_start = start & 7
+    mask = (1 << length) - 1
+    val = bytes_to_value(bytes[byte_start:byte_end + 1])
+    val &= ~(mask << bit_start)
+    val |= ((bvalue & mask) << bit_start)
+    bytes[byte_start:byte_end+1] = value_to_bytearray(
+        val,
+        byte_end + 1 - byte_start)
+
+
+def value_to_bytes(value, length):
+    return value.to_bytes(length, 'little')
+
+
+def bytes_to_value(bytes):
+    return int.from_bytes(bytes, 'little')
+
+
+def value_to_bytearray(value, length):
+    return bytearray(value_to_bytes(value, length))
+
+
+def get_aligned_value(value, alignment=4):
+    if alignment != (1 << (alignment.bit_length() - 1)):
+        raise Exception(
+            'Alignment (0x%x) should be power of 2 !' % alignment)
+    value = (value + (alignment - 1)) & ~(alignment - 1)
+    return value
+
+
+def get_padding_length(data_len, alignment=4):
+    new_data_len = get_aligned_value(data_len, alignment)
+    return new_data_len - data_len
+
+
+def get_file_data(file, mode='rb'):
+    return open(file, mode).read()
+
+
+def gen_file_from_object(file, object):
+    open(file, 'wb').write(object)
+
+
+def gen_file_with_size(file, size):
+    open(file, 'wb').write(b'\xFF' * size)
+
+
+def check_files_exist(base_name_list, dir='', ext=''):
+    for each in base_name_list:
+        if not os.path.exists(os.path.join(dir, each + ext)):
+            return False
+    return True
+
+
+def load_source(name, filepath):
+    mod = SourceFileLoader(name, filepath).load_module()
+    return mod
+
+
+def get_openssl_path():
+    if os.name == 'nt':
+        if 'OPENSSL_PATH' not in os.environ:
+            openssl_dir = "C:\\Openssl\\bin\\"
+            if os.path.exists(openssl_dir):
+                os.environ['OPENSSL_PATH'] = openssl_dir
+            else:
+                os.environ['OPENSSL_PATH'] = "C:\\Openssl\\"
+        if 'OPENSSL_CONF' not in os.environ:
+            openssl_cfg = "C:\\Openssl\\openssl.cfg"
+            if os.path.exists(openssl_cfg):
+                os.environ['OPENSSL_CONF'] = openssl_cfg
+        openssl = os.path.join(
+            os.environ.get('OPENSSL_PATH', ''),
+            'openssl.exe')
+    else:
+        # Get openssl path for Linux cases
+        openssl = shutil.which('openssl')
+
+    return openssl
+
+
+def run_process(arg_list, print_cmd=False, capture_out=False):
+    sys.stdout.flush()
+    if os.name == 'nt' and os.path.splitext(arg_list[0])[1] == '' and \
+       os.path.exists(arg_list[0] + '.exe'):
+        arg_list[0] += '.exe'
+    if print_cmd:
+        print(' '.join(arg_list))
+
+    exc = None
+    result = 0
+    output = ''
+    try:
+        if capture_out:
+            output = subprocess.check_output(arg_list).decode()
+        else:
+            result = subprocess.call(arg_list)
+    except Exception as ex:
+        result = 1
+        exc = ex
+
+    if result:
+        if not print_cmd:
+            print('Error in running process:\n  %s' % ' '.join(arg_list))
+        if exc is None:
+            sys.exit(1)
+        else:
+            raise exc
+
+    return output
+
+
+# Adjust hash type algorithm based on Public key file
+def adjust_hash_type(pub_key_file):
+    key_type = get_key_type(pub_key_file)
+    if key_type == 'RSA2048':
+        hash_type = 'SHA2_256'
+    elif key_type == 'RSA3072':
+        hash_type = 'SHA2_384'
+    else:
+        hash_type = None
+
+    return hash_type
+
+
+def rsa_sign_file(=0D
+ priv_key, pub_key, hash_type, sign_scheme,=0D
+ in_file, out_file, inc_dat=3DFalse, inc_key=3DFalse):=0D
+=0D
+ bins =3D bytearray()=0D
+ if inc_dat:=0D
+ bins.extend(get_file_data(in_file))=0D
+=0D
+=0D
+# def single_sign_file(priv_key, hash_type, sign_scheme, in_file, out_file=
):=0D
+=0D
+ out_data =3D get_file_data(out_file)=0D
+
+    sign = SIGNATURE_HDR()
+    sign.SigSize = len(out_data)
+    sign.SigType = SIGN_TYPE_SCHEME[sign_scheme]
+    sign.HashAlg = HASH_TYPE_VALUE[hash_type]
+
+    bins.extend(bytearray(sign) + out_data)
+    if inc_key:
+        key = gen_pub_key(priv_key, pub_key)
+        bins.extend(key)
+
+    if len(bins) != len(out_data):
+        gen_file_from_object(out_file, bins)
+
+
+def get_key_type(in_key):
+
+    # Check whether in_key is a file path or a key id
+    if not os.path.exists(in_key):
+        key = bytearray(gen_pub_key(in_key))
+    else:
+        # Check for a public key in binary format.
+        key = bytearray(get_file_data(in_key))
+
+    pub_key_hdr = PUB_KEY_HDR.from_buffer(key)
+    if pub_key_hdr.Identifier != b'PUBK':
+        pub_key = gen_pub_key(in_key)
+        pub_key_hdr = PUB_KEY_HDR.from_buffer(pub_key)
+
+    key_type = next(
+        (key for key,
+         value in PUB_KEY_TYPE.items() if value == pub_key_hdr.KeyType))
+    return '%s%d' % (key_type, (pub_key_hdr.KeySize - 4) * 8)
+
+
+def get_auth_hash_type(key_type, sign_scheme):
+    if key_type == "RSA2048" and sign_scheme == "RSA_PKCS1":
+        hash_type = 'SHA2_256'
+        auth_type = 'RSA2048_PKCS1_SHA2_256'
+    elif key_type == "RSA3072" and sign_scheme == "RSA_PKCS1":
+        hash_type = 'SHA2_384'
+        auth_type = 'RSA3072_PKCS1_SHA2_384'
+    elif key_type == "RSA2048" and sign_scheme == "RSA_PSS":
+        hash_type = 'SHA2_256'
+        auth_type = 'RSA2048_PSS_SHA2_256'
+    elif key_type == "RSA3072" and sign_scheme == "RSA_PSS":
+        hash_type = 'SHA2_384'
+        auth_type = 'RSA3072_PSS_SHA2_384'
+    else:
+        hash_type = ''
+        auth_type = ''
+    return auth_type, hash_type
+
+
+# def single_sign_gen_pub_key(in_key, pub_key_file=None):
+
+
+def gen_pub_key(in_key, pub_key=None):
+
+    keydata = single_sign_gen_pub_key(in_key, pub_key)
+
+    publickey = PUB_KEY_HDR()
+    publickey.KeySize = len(keydata)
+    publickey.KeyType = PUB_KEY_TYPE['RSA']
+
+    key = bytearray(publickey) + keydata
+
+    if pub_key:
+        gen_file_from_object(pub_key, key)
+
+    return key
+
+
+def decompress(in_file, out_file, tool_dir=''):
+    if not os.path.isfile(in_file):
+        raise Exception("Invalid input file '%s' !" % in_file)
+
+    # Remove the Lz header
+    fi = open(in_file, 'rb')
+    di = bytearray(fi.read())
+    fi.close()
+
+    lz_hdr = LZ_HEADER.from_buffer(di)
+    offset = sizeof(lz_hdr)
+    if lz_hdr.signature == b"LZDM" or lz_hdr.compressed_len == 0:
+        fo = open(out_file, 'wb')
+        fo.write(di[offset:offset + lz_hdr.compressed_len])
+        fo.close()
+        return
+
+    temp = os.path.splitext(out_file)[0] + '.tmp'
+    if lz_hdr.signature == b"LZMA":
+        alg = "Lzma"
+    elif lz_hdr.signature == b"LZ4 ":
+        alg = "Lz4"
+    else:
+        raise Exception("Unsupported compression '%s' !" % lz_hdr.signature)
+
+    fo = open(temp, 'wb')
+    fo.write(di[offset:offset + lz_hdr.compressed_len])
+    fo.close()
+
+    compress_tool = "%sCompress" % alg
+    if alg == "Lz4":
+        try:
+            cmdline = [
+                os.path.join(tool_dir, compress_tool),
+                "-d",
+                "-o", out_file,
+                temp]
+            run_process(cmdline, False, True)
+        except Exception:
+            msg_string = "Could not find/use CompressLz4 tool, " \
+                         "trying with python lz4..."
+            print(msg_string)
+            try:
+                import lz4.block
+                if lz4.VERSION != '3.1.1':
+                    msg_string = "Recommended lz4 module version " \
+                                 "is '3.1.1', " + lz4.VERSION \
+                                 + " is currently installed."
+                    print(msg_string)
+            except ImportError:
+                msg_string = "Could not import lz4, use " \
+                             "'python -m pip install lz4==3.1.1' " \
+                             "to install it."
+                print(msg_string)
+                exit(1)
+            decompress_data = lz4.block.decompress(get_file_data(temp))
+            with open(out_file, "wb") as lz4bin:
+                lz4bin.write(decompress_data)
+    else:
+        cmdline = [
+            os.path.join(tool_dir, compress_tool),
+            "-d",
+            "-o", out_file,
+            temp]
+        run_process(cmdline, False, True)
+    os.remove(temp)
+
+
+def compress(in_file, alg, svn=0, out_path='', tool_dir=''):
+    if not os.path.isfile(in_file):
+        raise Exception("Invalid input file '%s' !" % in_file)
+
+    basename, ext = os.path.splitext(os.path.basename(in_file))
+    if out_path:
+        if os.path.isdir(out_path):
+            out_file = os.path.join(out_path, basename + '.lz')
+        else:
+            out_file = os.path.join(out_path)
+    else:
+        out_file = os.path.splitext(in_file)[0] + '.lz'
+
+    if alg == "Lzma":
+        sig = "LZMA"
+    elif alg == "Tiano":
+        sig = "LZUF"
+    elif alg == "Lz4":
+        sig = "LZ4 "
+    elif alg == "Dummy":
+        sig = "LZDM"
+    else:
+        raise Exception("Unsupported compression '%s' !" % alg)
+
+    in_len = os.path.getsize(in_file)
+    if in_len > 0:
+        compress_tool = "%sCompress" % alg
+        if sig == "LZDM":
+            shutil.copy(in_file, out_file)
+            compress_data = get_file_data(out_file)
+        elif sig == "LZ4 ":
+            try:
+                cmdline = [
+                    os.path.join(tool_dir, compress_tool),
+                    "-e",
+                    "-o", out_file,
+                    in_file]
+                run_process(cmdline, False, True)
+                compress_data = get_file_data(out_file)
+            except Exception:
+                msg_string = "Could not find/use CompressLz4 tool, " \
+                             "trying with python lz4..."
+                print(msg_string)
+                try:
+                    import lz4.block
+                    if lz4.VERSION != '3.1.1':
+                        msg_string = "Recommended lz4 module version " \
+                                     "is '3.1.1', " + lz4.VERSION \
+                                     + " is currently installed."
+                        print(msg_string)
+                except ImportError:
+                    msg_string = "Could not import lz4, use " \
+                                 "'python -m pip install lz4==3.1.1' " \
+                                 "to install it."
+                    print(msg_string)
+                    exit(1)
+                compress_data = lz4.block.compress(
+                    get_file_data(in_file),
+                    mode='high_compression')
+        elif sig == "LZMA":
+            cmdline = [
+                os.path.join(tool_dir, compress_tool),
+                "-e",
+                "-o", out_file,
+                in_file]
+            run_process(cmdline, False, True)
+            compress_data = get_file_data(out_file)
+    else:
+        compress_data = bytearray()
+
+    lz_hdr = LZ_HEADER()
+    lz_hdr.signature = sig.encode()
+    lz_hdr.svn = svn
+    lz_hdr.compressed_len = len(compress_data)
+    lz_hdr.length = os.path.getsize(in_file)
+    data = bytearray()
+    data.extend(lz_hdr)
+    data.extend(compress_data)
+    gen_file_from_object(out_file, data)
+
+    return out_file
diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py b/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py
new file mode 100644
index 0000000000..a7f79bbc96
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py
@@ -0,0 +1,1499 @@
+# @ ConfigEditor.py
+#
+# Copyright(c) 2018 - 2021, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+##
+
+import os
+import sys
+import marshal
+import tkinter
+import tkinter.ttk as ttk
+import tkinter.messagebox as messagebox
+import tkinter.filedialog as filedialog
+
+from pathlib import Path
+from GenYamlCfg import CGenYamlCfg, bytes_to_value, \
+     bytes_to_bracket_str, value_to_bytes, array_str_to_value
+from ctypes import sizeof, Structure, ARRAY, c_uint8, c_uint64, c_char, \
+     c_uint32, c_uint16
+from functools import reduce
+
+sys.path.insert(0, '..')
+from FspDscBsf2Yaml import bsf_to_dsc, dsc_to_yaml  # noqa
+
+
+sys.dont_write_bytecode = True
+
+
+class create_tool_tip(object):
+    '''
+    Create a tooltip for a given widget
+    '''
+    in_progress = False
+
+    def __init__(self, widget, text=''):
+        self.top_win = None
+        self.widget = widget
+        self.text = text
+        self.widget.bind("<Enter>", self.enter)
+        self.widget.bind("<Leave>", self.leave)
+
+    def enter(self, event=None):
+        if self.in_progress:
+            return
+        if self.widget.winfo_class() == 'Treeview':
+            # Only show help when the cursor is on a row header.
+            rowid = self.widget.identify_row(event.y)
+            if rowid != '':
+                return
+        else:
+            x, y, cx, cy = self.widget.bbox("insert")
+
+        cursor = self.widget.winfo_pointerxy()
+        x = self.widget.winfo_rootx() + 35
+        y = self.widget.winfo_rooty() + 20
+        if cursor[1] > y and cursor[1] < y + 20:
+            y += 20
+
+        # Create a toplevel window
+        self.top_win = tkinter.Toplevel(self.widget)
+        # Leave only the label and remove the app window
+        self.top_win.wm_overrideredirect(True)
+        self.top_win.wm_geometry("+%d+%d" % (x, y))
+        label = tkinter.Message(self.top_win,
+                                text=self.text,
+                                justify='left',
+                                background='bisque',
+                                relief='solid',
+                                borderwidth=1,
+                                font=("times", "10", "normal"))
+        label.pack(ipadx=1)
+        self.in_progress = True
+
+    def leave(self, event=None):
+        if self.top_win:
+            self.top_win.destroy()
+            self.in_progress = False
+
+
+class validating_entry(tkinter.Entry):
+    def __init__(self, master, **kw):
+        tkinter.Entry.__init__(*(self, master), **kw)
+        self.parent = master
+        self.old_value = ''
+        self.last_value = ''
+        self.variable = tkinter.StringVar()
+        self.variable.trace("w", self.callback)
+        self.config(textvariable=self.variable)
+        self.config({"background": "#c0c0c0"})
+        self.bind("<Return>", self.move_next)
+        self.bind("<Tab>", self.move_next)
+        self.bind("<Escape>", self.cancel)
+        for each in ['BackSpace', 'Delete']:
+            self.bind("<%s>" % each, self.ignore)
+        self.display(None)
+
+    def ignore(self, even):
+        return "break"
+
+    def move_next(self, event):
+        if self.row < 0:
+            return
+        row, col = self.row, self.col
+        txt, row_id, col_id = self.parent.get_next_cell(row, col)
+        self.display(txt, row_id, col_id)
+        return "break"
+
+    def cancel(self, event):
+        self.variable.set(self.old_value)
+        self.display(None)
+
+    def display(self, txt, row_id='', col_id=''):
+        if txt is None:
+            self.row = -1
+            self.col = -1
+            self.place_forget()
+        else:
+            row = int('0x' + row_id[1:], 0) - 1
+            col = int(col_id[1:]) - 1
+            self.row = row
+            self.col = col
+            self.old_value = txt
+            self.last_value = txt
+            x, y, width, height = self.parent.bbox(row_id, col)
+            self.place(x=x, y=y, w=width)
+            self.variable.set(txt)
+            self.focus_set()
+            self.icursor(0)
+
+    def callback(self, *Args):
+        cur_val = self.variable.get()
+        new_val = self.validate(cur_val)
+        if new_val is not None and self.row >= 0:
+            self.last_value = new_val
+            self.parent.set_cell(self.row, self.col, new_val)
+        self.variable.set(self.last_value)
+
+    def validate(self, value):
+        if len(value) > 0:
+            try:
+                int(value, 16)
+            except Exception:
+                return None
+
+        # Normalize the cell format
+        self.update()
+        cell_width = self.winfo_width()
+        max_len = custom_table.to_byte_length(cell_width) * 2
+        cur_pos = self.index("insert")
+        if cur_pos == max_len + 1:
+            value = value[-max_len:]
+        else:
+            value = value[:max_len]
+        if value == '':
+            value = '0'
+        fmt = '%%0%dX' % max_len
+        return fmt % int(value, 16)
+
+
+class custom_table(ttk.Treeview):
+    _Padding = 20
+    _Char_width = 6
+
+    def __init__(self, parent, col_hdr, bins):
+        cols = len(col_hdr)
+
+        col_byte_len = []
+        for col in range(cols):  # Columns
+            col_byte_len.append(int(col_hdr[col].split(':')[1]))
+
+        byte_len = sum(col_byte_len)
+        rows = (len(bins) + byte_len - 1) // byte_len
+
+        self.rows = rows
+        self.cols = cols
+        self.col_byte_len = col_byte_len
+        self.col_hdr = col_hdr
+
+        self.size = len(bins)
+        self.last_dir = ''
+
+        style = ttk.Style()
+        style.configure("Custom.Treeview.Heading",
+                        font=('calibri', 10, 'bold'),
+                        foreground="blue")
+        ttk.Treeview.__init__(self, parent, height=rows,
+                              columns=[''] + col_hdr, show='headings',
+                              style="Custom.Treeview",
+                              selectmode='none')
+        self.bind("<Button-1>", self.click)
+        self.bind("<FocusOut>", self.focus_out)
+        self.entry = validating_entry(self, width=4, justify=tkinter.CENTER)
+
+        self.heading(0, text='LOAD')
+        self.column(0, width=60, stretch=0, anchor=tkinter.CENTER)
+
+        for col in range(cols):  # Columns
+            text = col_hdr[col].split(':')[0]
+            byte_len = int(col_hdr[col].split(':')[1])
+            self.heading(col+1, text=text)
+            self.column(col+1, width=self.to_cell_width(byte_len),
+                        stretch=0, anchor=tkinter.CENTER)
+        idx = 0
+        for row in range(rows):  # Rows
+            text = '%04X' % (row * len(col_hdr))
+            vals = ['%04X:' % (cols * row)]
+            for col in range(cols):  # Columns
+                if idx >= len(bins):
+                    break
+                byte_len = int(col_hdr[col].split(':')[1])
+                value = bytes_to_value(bins[idx:idx+byte_len])
+                hex = ("%%0%dX" % (byte_len * 2)) % value
+                vals.append(hex)
+                idx += byte_len
+            self.insert('', 'end', values=tuple(vals))
+            if idx >= len(bins):
+                break
+
+    @staticmethod
+    def to_cell_width(byte_len):
+        return byte_len * 2 * custom_table._Char_width + custom_table._Padding
+
+    @staticmethod
+    def to_byte_length(cell_width):
+        return (cell_width - custom_table._Padding) \
+            // (2 * custom_table._Char_width)
+
+    def focus_out(self, event):
+        self.entry.display(None)
+
+    def refresh_bin(self, bins):
+        if not bins:
+            return
+
+        # Reload binary into widget
+        bin_len = len(bins)
+        for row in range(self.rows):
+            iid = self.get_children()[row]
+            for col in range(self.cols):
+                idx = row * sum(self.col_byte_len) + \
+                      sum(self.col_byte_len[:col])
+                byte_len = self.col_byte_len[col]
+                if idx + byte_len <= self.size:
+                    byte_len = int(self.col_hdr[col].split(':')[1])
+                    if idx + byte_len > bin_len:
+                        val = 0
+                    else:
+                        val = bytes_to_value(bins[idx:idx+byte_len])
+                    hex_val = ("%%0%dX" % (byte_len * 2)) % val
+                    self.set(iid, col + 1, hex_val)
+
+    def get_cell(self, row, col):
+        iid = self.get_children()[row]
+        txt = self.item(iid, 'values')[col]
+        return txt
+
+    def get_next_cell(self, row, col):
+        rows = self.get_children()
+        col += 1
+        if col > self.cols:
+            col = 1
+            row += 1
+        cnt = row * sum(self.col_byte_len) + sum(self.col_byte_len[:col])
+        if cnt > self.size:
+            # Reached the last cell, so roll back to beginning
+            row = 0
+            col = 1
+
+        txt = self.get_cell(row, col)
+        row_id = rows[row]
+        col_id = '#%d' % (col + 1)
+        return (txt, row_id, col_id)
+
+    def set_cell(self, row, col, val):
+        iid = self.get_children()[row]
+        self.set(iid, col, val)
+
+    def load_bin(self):
+        # Load binary from file
+        path = filedialog.askopenfilename(
+            initialdir=self.last_dir,
+            title="Load binary file",
+            filetypes=(("Binary files", "*.bin"), (
+                "binary files", "*.bin")))
+        if path:
+            self.last_dir = os.path.dirname(path)
+            fd = open(path, 'rb')
+            bins = bytearray(fd.read())[:self.size]
+            fd.close()
+            bins.extend(b'\x00' * (self.size - len(bins)))
+            return bins
+
+        return None
+
+    def click(self, event):
+        row_id = self.identify_row(event.y)
+        col_id = self.identify_column(event.x)
+        if row_id == '' and col_id == '#1':
+            # Clicked on "LOAD" cell
+            bins = self.load_bin()
+            self.refresh_bin(bins)
+            return
+
+        if col_id == '#1':
+            # Clicked on column 1 (Offset column)
+            return
+
+        item = self.identify('item', event.x, event.y)
+        if not item or not col_id:
+            # Not clicked on valid cell
+            return
+
+        # Clicked cell
+        row = int('0x' + row_id[1:], 0) - 1
+        col = int(col_id[1:]) - 1
+        if row * self.cols + col > self.size:
+            return
+
+        vals = self.item(item, 'values')
+        if col < len(vals):
+            txt = self.item(item, 'values')[col]
+            self.entry.display(txt, row_id, col_id)
+
+    def get(self):
+        bins = bytearray()
+        row_ids = self.get_children()
+        for row_id in row_ids:
+            row = int('0x' + row_id[1:], 0) - 1
+            for col in range(self.cols):
+                idx = row * sum(self.col_byte_len) + \
+                      sum(self.col_byte_len[:col])
+                byte_len = self.col_byte_len[col]
+                if idx + byte_len > self.size:
+                    break
+                hex = self.item(row_id, 'values')[col + 1]
+                values = value_to_bytes(int(hex, 16)
+                                        & ((1 << byte_len * 8) - 1), byte_len)
+                bins.extend(values)
+        return bins
+
+
+class c_uint24(Structure):
+    """Little-Endian 24-bit Unsigned Integer"""
+    _pack_ = 1
+    _fields_ = [('Data', (c_uint8 * 3))]
+
+    def __init__(self, val=0):
+        self.set_value(val)
+
+    def __str__(self, indent=0):
+        return '0x%.6x' % self.value
+
+    def __int__(self):
+        return self.get_value()
+
+    def set_value(self, val):
+        self.Data[0:3] = Val2Bytes(val, 3)
+
+    def get_value(self):
+        return Bytes2Val(self.Data[0:3])
+
+    value = property(get_value, set_value)
+
+
+class EFI_FIRMWARE_VOLUME_HEADER(Structure):
+    _fields_ = [
+        ('ZeroVector', ARRAY(c_uint8, 16)),
+        ('FileSystemGuid', ARRAY(c_uint8, 16)),
+        ('FvLength', c_uint64),
+        ('Signature', ARRAY(c_char, 4)),
+        ('Attributes', c_uint32),
+        ('HeaderLength', c_uint16),
+        ('Checksum', c_uint16),
+        ('ExtHeaderOffset', c_uint16),
+        ('Reserved', c_uint8),
+        ('Revision', c_uint8)
+        ]
+
+
+class EFI_FIRMWARE_VOLUME_EXT_HEADER(Structure):
+    _fields_ = [
+        ('FvName', ARRAY(c_uint8, 16)),
+        ('ExtHeaderSize', c_uint32)
+        ]
+
+
+class EFI_FFS_INTEGRITY_CHECK(Structure):
+    _fields_ = [
+        ('Header', c_uint8),
+        ('File', c_uint8)
+        ]
+
+
+class EFI_FFS_FILE_HEADER(Structure):
+    _fields_ = [
+        ('Name', ARRAY(c_uint8, 16)),
+        ('IntegrityCheck', EFI_FFS_INTEGRITY_CHECK),
+        ('Type', c_uint8),
+        ('Attributes', c_uint8),
+        ('Size', c_uint24),
+        ('State', c_uint8)
+        ]
+
+
+class EFI_COMMON_SECTION_HEADER(Structure):
+    _fields_ = [
+        ('Size', c_uint24),
+        ('Type', c_uint8)
+        ]
+
+
+class EFI_SECTION_TYPE:
+    """Enumeration of all valid firmware file section types."""
+    ALL = 0x00
+    COMPRESSION = 0x01
+    GUID_DEFINED = 0x02
+    DISPOSABLE = 0x03
+    PE32 = 0x10
+    PIC = 0x11
+    TE = 0x12
+    DXE_DEPEX = 0x13
+    VERSION = 0x14
+    USER_INTERFACE = 0x15
+    COMPATIBILITY16 = 0x16
+    FIRMWARE_VOLUME_IMAGE = 0x17
+    FREEFORM_SUBTYPE_GUID = 0x18
+    RAW = 0x19
+    PEI_DEPEX = 0x1b
+    SMM_DEPEX = 0x1c
+
+
+class FSP_COMMON_HEADER(Structure):
+    _fields_ = [
+        ('Signature', ARRAY(c_char, 4)),
+        ('HeaderLength', c_uint32)
+        ]
+
+
+class FSP_INFORMATION_HEADER(Structure):
+    _fields_ = [
+        ('Signature', ARRAY(c_char, 4)),
+        ('HeaderLength', c_uint32),
+        ('Reserved1', c_uint16),
+        ('SpecVersion', c_uint8),
+        ('HeaderRevision', c_uint8),
+        ('ImageRevision', c_uint32),
+        ('ImageId', ARRAY(c_char, 8)),
+        ('ImageSize', c_uint32),
+        ('ImageBase', c_uint32),
+        ('ImageAttribute', c_uint16),
+        ('ComponentAttribute', c_uint16),
+        ('CfgRegionOffset', c_uint32),
+        ('CfgRegionSize', c_uint32),
+        ('Reserved2', c_uint32),
+        ('TempRamInitEntryOffset', c_uint32),
+        ('Reserved3', c_uint32),
+        ('NotifyPhaseEntryOffset', c_uint32),
+        ('FspMemoryInitEntryOffset', c_uint32),
+        ('TempRamExitEntryOffset', c_uint32),
+        ('FspSiliconInitEntryOffset', c_uint32)
+        ]
+
+
+class FSP_EXTENDED_HEADER(Structure):
+    _fields_ = [
+        ('Signature', ARRAY(c_char, 4)),
+        ('HeaderLength', c_uint32),
+        ('Revision', c_uint8),
+        ('Reserved', c_uint8),
+        ('FspProducerId', ARRAY(c_char, 6)),
+        ('FspProducerRevision', c_uint32),
+        ('FspProducerDataSize', c_uint32)
+        ]
+
+
+class FSP_PATCH_TABLE(Structure):
+    _fields_ = [
+        ('Signature', ARRAY(c_char, 4)),
+        ('HeaderLength', c_uint16),
+        ('HeaderRevision', c_uint8),
+        ('Reserved', c_uint8),
+        ('PatchEntryNum', c_uint32)
+        ]
+
+
+class Section:
+    def __init__(self, offset, secdata):
+        self.SecHdr = EFI_COMMON_SECTION_HEADER.from_buffer(secdata, 0)
+        self.SecData = secdata[0:int(self.SecHdr.Size)]
+        self.Offset = offset
+
+
+def AlignPtr(offset, alignment=8):
+    return (offset + alignment - 1) & ~(alignment - 1)
+
+
+def Bytes2Val(bytes):
+    return reduce(lambda x, y: (x << 8) | y, bytes[::-1])
+
+
+def Val2Bytes(value, blen):
+    return [(value >> (i*8) & 0xff) for i in range(blen)]
+
+
+class FirmwareFile:
+    def __init__(self, offset, filedata):
+        self.FfsHdr = EFI_FFS_FILE_HEADER.from_buffer(filedata, 0)
+        self.FfsData = filedata[0:int(self.FfsHdr.Size)]
+        self.Offset = offset
+        self.SecList = []
+
+    def ParseFfs(self):
+        ffssize = len(self.FfsData)
+        offset = sizeof(self.FfsHdr)
+        if self.FfsHdr.Name != '\xff' * 16:
+            while offset < (ffssize - sizeof(EFI_COMMON_SECTION_HEADER)):
+                sechdr = EFI_COMMON_SECTION_HEADER.from_buffer(
+                    self.FfsData, offset)
+                sec = Section(
+                    offset, self.FfsData[offset:offset + int(sechdr.Size)])
+                self.SecList.append(sec)
+                offset += int(sechdr.Size)
+                offset = AlignPtr(offset, 4)
+
+
+class FirmwareVolume:
+    def __init__(self, offset, fvdata):
+        self.FvHdr = EFI_FIRMWARE_VOLUME_HEADER.from_buffer(fvdata, 0)
+        self.FvData = fvdata[0: self.FvHdr.FvLength]
+        self.Offset = offset
+        if self.FvHdr.ExtHeaderOffset > 0:
+            self.FvExtHdr = EFI_FIRMWARE_VOLUME_EXT_HEADER.from_buffer(
+                self.FvData, self.FvHdr.ExtHeaderOffset)
+        else:
+            self.FvExtHdr = None
+        self.FfsList = []
+
+    def ParseFv(self):
+        fvsize = len(self.FvData)
+        if self.FvExtHdr:
+            offset = self.FvHdr.ExtHeaderOffset + self.FvExtHdr.ExtHeaderSize
+        else:
+            offset = self.FvHdr.HeaderLength
+        offset = AlignPtr(offset)
+        while offset < (fvsize - sizeof(EFI_FFS_FILE_HEADER)):
+            ffshdr = EFI_FFS_FILE_HEADER.from_buffer(self.FvData, offset)
+            if (ffshdr.Name == '\xff' * 16) and \
+               (int(ffshdr.Size) == 0xFFFFFF):
+                offset = fvsize
+            else:
+                ffs = FirmwareFile(
+                    offset, self.FvData[offset:offset + int(ffshdr.Size)])
+                ffs.ParseFfs()
+                self.FfsList.append(ffs)
+                offset += int(ffshdr.Size)
+                offset = AlignPtr(offset)
+
+
+class FspImage:
+    def __init__(self, offset, fih, fihoff, patch):
+        self.Fih = fih
+        self.FihOffset = fihoff
+        self.Offset = offset
+        self.FvIdxList = []
+        self.Type = "XTMSXXXXOXXXXXXX"[(fih.ComponentAttribute >> 12) & 0x0F]
+        self.PatchList = patch
+        self.PatchList.append(fihoff + 0x1C)
+
+    def AppendFv(self, FvIdx):
+        self.FvIdxList.append(FvIdx)
+
+    def Patch(self, delta, fdbin):
+        count = 0
+        applied = 0
+        for idx, patch in enumerate(self.PatchList):
+            ptype = (patch >> 24) & 0x0F
+            if ptype not in [0x00, 0x0F]:
+                raise Exception('ERROR: Invalid patch type %d !' % ptype)
+            if patch & 0x80000000:
+                patch = self.Fih.ImageSize - (0x1000000 - (patch & 0xFFFFFF))
+            else:
+                patch = patch & 0xFFFFFF
+            if (patch < self.Fih.ImageSize) and \
+               (patch + sizeof(c_uint32) <= self.Fih.ImageSize):
+                offset = patch + self.Offset
+                value = Bytes2Val(fdbin[offset:offset+sizeof(c_uint32)])
+                value += delta
+                fdbin[offset:offset+sizeof(c_uint32)] = Val2Bytes(
+                    value, sizeof(c_uint32))
+                applied += 1
+            count += 1
+        # Don't count the FSP base address patch entry appended at the end
+        if count != 0:
+            count -= 1
+            applied -= 1
+        return (count, applied)
+
+
+class FirmwareDevice:
+    def __init__(self, offset, FdData):
+        self.FvList = []
+        self.FspList = []
+        self.FspExtList = []
+        self.FihList = []
+        self.BuildList = []
+        self.OutputText = ""
+        self.Offset = 0
+        self.FdData = FdData
+
+    def ParseFd(self):
+        offset = 0
+        fdsize = len(self.FdData)
+        self.FvList = []
+        while offset < (fdsize - sizeof(EFI_FIRMWARE_VOLUME_HEADER)):
+            fvh = EFI_FIRMWARE_VOLUME_HEADER.from_buffer(self.FdData, offset)
+            if b'_FVH' != fvh.Signature:
+                raise Exception("ERROR: Invalid FV header !")
+            fv = FirmwareVolume(
+                offset, self.FdData[offset:offset + fvh.FvLength])
+            fv.ParseFv()
+            self.FvList.append(fv)
+            offset += fv.FvHdr.FvLength
+
+    def CheckFsp(self):
+        if len(self.FspList) == 0:
+            return
+
+        fih = None
+        for fsp in self.FspList:
+            if not fih:
+                fih = fsp.Fih
+            else:
+                newfih = fsp.Fih
+                if (newfih.ImageId != fih.ImageId) or \
+                   (newfih.ImageRevision != fih.ImageRevision):
+                    raise Exception(
+                        "ERROR: Inconsistent FSP ImageId or "
+                        "ImageRevision detected !")
+
+    def ParseFsp(self):
+        flen = 0
+        for idx, fv in enumerate(self.FvList):
+            # Check if this FV contains an FSP header
+            if flen == 0:
+                if len(fv.FfsList) == 0:
+                    continue
+                ffs = fv.FfsList[0]
+                if len(ffs.SecList) == 0:
+                    continue
+                sec = ffs.SecList[0]
+                if sec.SecHdr.Type != EFI_SECTION_TYPE.RAW:
+                    continue
+                fihoffset = ffs.Offset + sec.Offset + sizeof(sec.SecHdr)
+                fspoffset = fv.Offset
+                offset = fspoffset + fihoffset
+                fih = FSP_INFORMATION_HEADER.from_buffer(self.FdData, offset)
+                self.FihList.append(fih)
+                if b'FSPH' != fih.Signature:
+                    continue
+
+                offset += fih.HeaderLength
+
+                offset = AlignPtr(offset, 2)
+                Extfih = FSP_EXTENDED_HEADER.from_buffer(self.FdData, offset)
+                self.FspExtList.append(Extfih)
+                offset = AlignPtr(offset, 4)
+                plist = []
+                while True:
+                    fch = FSP_COMMON_HEADER.from_buffer(self.FdData, offset)
+                    if b'FSPP' != fch.Signature:
+                        offset += fch.HeaderLength
+                        offset = AlignPtr(offset, 4)
+                    else:
+                        fspp = FSP_PATCH_TABLE.from_buffer(
+                            self.FdData, offset)
+                        offset += sizeof(fspp)
+                        start_offset = offset + 32
+                        end_offset = offset + 32
+                        while True:
+                            end_offset += 1
+                            if (self.FdData[
+                                    end_offset: end_offset + 1] == b'\xff'):
+                                break
+                        self.BuildList.append(
+                            self.FdData[start_offset:end_offset])
+                        pdata = (c_uint32 * fspp.PatchEntryNum).from_buffer(
+                            self.FdData, offset)
+                        plist = list(pdata)
+                        break
+
+                fsp = FspImage(fspoffset, fih, fihoffset, plist)
+                fsp.AppendFv(idx)
+                self.FspList.append(fsp)
+                flen = fsp.Fih.ImageSize - fv.FvHdr.FvLength
+            else:
+                fsp.AppendFv(idx)
+                flen -= fv.FvHdr.FvLength
+                if flen < 0:
+                    raise Exception("ERROR: Incorrect FV size in image !")
+        self.CheckFsp()
+
+    def OutputFsp(self):
+        def copy_text_to_clipboard():
+            window.clipboard_clear()
+            window.clipboard_append(self.OutputText)
+
+        window = tkinter.Tk()
+        window.title("Fsp Headers")
+        window.resizable(0, 0)
+        # Window size
+        window.geometry("300x400+350+150")
+        frame = tkinter.Frame(window)
+        frame.pack(side=tkinter.BOTTOM)
+        # Vertical (y) scroll bar
+        scroll = tkinter.Scrollbar(window)
+        scroll.pack(side=tkinter.RIGHT, fill=tkinter.Y)
+        text = tkinter.Text(window,
+                            wrap=tkinter.NONE, yscrollcommand=scroll.set)
+        i = 0
+        self.OutputText = self.OutputText + "Fsp Header Details \n\n"
+        while i < len(self.FihList):
+            try:
+                self.OutputText += str(self.BuildList[i].decode()) + "\n"
+            except Exception:
+                self.OutputText += "No description found\n"
+            self.OutputText += "FSP Header :\n "
+            self.OutputText += "Signature : " + \
+                str(self.FihList[i].Signature.decode('utf-8')) + "\n "
+            self.OutputText += "Header Length : " + \
+                str(hex(self.FihList[i].HeaderLength)) + "\n "
+            self.OutputText += "Header Revision : " + \
+                str(hex(self.FihList[i].HeaderRevision)) + "\n "
+            self.OutputText += "Spec Version : " + \
+                str(hex(self.FihList[i].SpecVersion)) + "\n "
+            self.OutputText += "Image Revision : " + \
+                str(hex(self.FihList[i].ImageRevision)) + "\n "
+            self.OutputText += "Image Id : " + \
+                str(self.FihList[i].ImageId.decode('utf-8')) + "\n "
+            self.OutputText += "Image Size : " + \
+                str(hex(self.FihList[i].ImageSize)) + "\n "
+            self.OutputText += "Image Base : " + \
+                str(hex(self.FihList[i].ImageBase)) + "\n "
+            self.OutputText += "Image Attribute : " + \
+                str(hex(self.FihList[i].ImageAttribute)) + "\n "
+            self.OutputText += "Cfg Region Offset : " + \
+                str(hex(self.FihList[i].CfgRegionOffset)) + "\n "
+            self.OutputText += "Cfg Region Size : " + \
+                str(hex(self.FihList[i].CfgRegionSize)) + "\n "
+            self.OutputText += "API Entry Num : " + \
+                str(hex(self.FihList[i].Reserved2)) + "\n "
+            self.OutputText += "Temp Ram Init Entry : " + \
+                str(hex(self.FihList[i].TempRamInitEntryOffset)) + "\n "
+            self.OutputText += "FSP Init Entry : " + \
+                str(hex(self.FihList[i].Reserved3)) + "\n "
+            self.OutputText += "Notify Phase Entry : " + \
+                str(hex(self.FihList[i].NotifyPhaseEntryOffset)) + "\n "
+            self.OutputText += "Fsp Memory Init Entry : " + \
+                str(hex(self.FihList[i].FspMemoryInitEntryOffset)) + "\n "
+            self.OutputText += "Temp Ram Exit Entry : " + \
+                str(hex(self.FihList[i].TempRamExitEntryOffset)) + "\n "
+            self.OutputText += "Fsp Silicon Init Entry : " + \
+                str(hex(self.FihList[i].FspSiliconInitEntryOffset)) + "\n\n"
+            self.OutputText += "FSP Extended Header:\n "
+            self.OutputText += "Signature : " + \
+                str(self.FspExtList[i].Signature.decode('utf-8')) + "\n "
+            self.OutputText += "Header Length : " + \
+                str(hex(self.FspExtList[i].HeaderLength)) + "\n "
+            self.OutputText += "Header Revision : " + \
+                str(hex(self.FspExtList[i].Revision)) + "\n "
+            self.OutputText += "Fsp Producer Id : " + \
+                str(self.FspExtList[i].FspProducerId.decode('utf-8')) + "\n "
+            self.OutputText += "FspProducerRevision : " + \
+                str(hex(self.FspExtList[i].FspProducerRevision)) + "\n\n"
+            i += 1
+        text.insert(tkinter.INSERT, self.OutputText)
+        text.pack()
+        # Configure the scrollbars
+        scroll.config(command=text.yview)
+        copy_button = tkinter.Button(
+            window, text="Copy to Clipboard", command=copy_text_to_clipboard)
+        copy_button.pack(in_=frame, side=tkinter.LEFT, padx=20, pady=10)
+        exit_button = tkinter.Button(
+            window, text="Close", command=window.destroy)
+        exit_button.pack(in_=frame, side=tkinter.RIGHT, padx=20, pady=10)
+        window.mainloop()
+
+
+class state:
+    def __init__(self):
+        self.state = False
+
+    def set(self, value):
+        self.state = value
+
+    def get(self):
+        return self.state
+
+
+class application(tkinter.Frame):
+    def __init__(self, master=None):
+        root = master
+
+        self.debug = True
+        self.mode = 'FSP'
+        self.last_dir = '.'
+        self.page_id = ''
+        self.page_list = {}
+        self.conf_list = {}
+        self.cfg_data_obj = None
+        self.org_cfg_data_bin = None
+        self.in_left = state()
+        self.in_right = state()
+
+        # Check if the current directory contains a file with a .yaml
+        # extension; if not, default self.last_dir to a Platform directory
+        # where it is easier to locate *BoardPkg\CfgData\*Def.yaml files
+        self.last_dir = '.'
+        if not any(fname.endswith('.yaml') for fname in os.listdir('.')):
+            platform_path = Path(os.path.realpath(__file__)).parents[2].\
+                joinpath('Platform')
+            if platform_path.exists():
+                self.last_dir = platform_path
+
+        tkinter.Frame.__init__(self, master, borderwidth=2)
+
+        self.menu_string = [
+            'Save Config Data to Binary', 'Load Config Data from Binary',
+            'Show Binary Information',
+            'Load Config Changes from Delta File',
+            'Save Config Changes to Delta File',
+            'Save Full Config Data to Delta File',
+            'Open Config BSF file'
+        ]
+
+        root.geometry("1200x800")
+
+        paned = ttk.Panedwindow(root, orient=tkinter.HORIZONTAL)
+        paned.pack(fill=tkinter.BOTH, expand=True, padx=(4, 4))
+
+        status = tkinter.Label(master, text="", bd=1, relief=tkinter.SUNKEN,
+                               anchor=tkinter.W)
+        status.pack(side=tkinter.BOTTOM, fill=tkinter.X)
+
+        frame_left = ttk.Frame(paned, height=800, relief="groove")
+
+        self.left = ttk.Treeview(frame_left, show="tree")
+
+        # Set up tree HScroller
+        pady = (10, 10)
+        self.tree_scroll = ttk.Scrollbar(frame_left,
+                                         orient="vertical",
+                                         command=self.left.yview)
+        self.left.configure(yscrollcommand=self.tree_scroll.set)
+        self.left.bind("<<TreeviewSelect>>", self.on_config_page_select_change)
+        self.left.bind("<Enter>", lambda e: self.in_left.set(True))
+        self.left.bind("<Leave>", lambda e: self.in_left.set(False))
+        self.left.bind("<MouseWheel>", self.on_tree_scroll)
+
+        self.left.pack(side='left',
+                       fill=tkinter.BOTH,
+                       expand=True,
+                       padx=(5, 0),
+                       pady=pady)
+        self.tree_scroll.pack(side='right', fill=tkinter.Y,
+                              pady=pady, padx=(0, 5))
+
+        frame_right = ttk.Frame(paned, relief="groove")
+        self.frame_right = frame_right
+
+        self.conf_canvas = tkinter.Canvas(frame_right, highlightthickness=0)
+        self.page_scroll = ttk.Scrollbar(frame_right,
+                                         orient="vertical",
+                                         command=self.conf_canvas.yview)
+        self.right_grid = ttk.Frame(self.conf_canvas)
+ self.conf_canvas.configure(yscrollcommand=3Dself.page_scroll.set)=
=0D
+ self.conf_canvas.pack(side=3D'left',=0D
+ fill=3Dtkinter.BOTH,=0D
+ expand=3DTrue,=0D
+ pady=3Dpady,=0D
+ padx=3D(5, 0))=0D
+ self.page_scroll.pack(side=3D'right', fill=3Dtkinter.Y,=0D
+ pady=3Dpady, padx=3D(0, 5))=0D
+ self.conf_canvas.create_window(0, 0, window=3Dself.right_grid,=0D
+ anchor=3D'nw')=0D
+ self.conf_canvas.bind('<Enter>', lambda e: self.in_right.set(True)=
)=0D
+ self.conf_canvas.bind('<Leave>', lambda e: self.in_right.set(False=
))=0D
+ self.conf_canvas.bind("<Configure>", self.on_canvas_configure)=0D
+ self.conf_canvas.bind_all("<MouseWheel>", self.on_page_scroll)=0D
+=0D
+ paned.add(frame_left, weight=3D2)=0D
+ paned.add(frame_right, weight=3D10)=0D
+=0D
+ style =3D ttk.Style()=0D
+ style.layout("Treeview", [('Treeview.treearea', {'sticky': 'nswe'}=
)])=0D
+=0D
+ menubar =3D tkinter.Menu(root)=0D
+ file_menu =3D tkinter.Menu(menubar, tearoff=3D0)=0D
+ file_menu.add_command(label=3D"Open Config YAML file",=0D
+ command=3Dself.load_from_yaml)=0D
+ file_menu.add_command(label=3Dself.menu_string[6],=0D
+ command=3Dself.load_from_bsf_file)=0D
+ file_menu.add_command(label=3Dself.menu_string[2],=0D
+ command=3Dself.load_from_fd)=0D
+ file_menu.add_command(label=3Dself.menu_string[0],=0D
+ command=3Dself.save_to_bin,=0D
+ state=3D'disabled')=0D
+ file_menu.add_command(label=3Dself.menu_string[1],=0D
+ command=3Dself.load_from_bin,=0D
+ state=3D'disabled')=0D
+ file_menu.add_command(label=3Dself.menu_string[3],=0D
+ command=3Dself.load_from_delta,=0D
+ state=3D'disabled')=0D
+ file_menu.add_command(label=3Dself.menu_string[4],=0D
+ command=3Dself.save_to_delta,=0D
+ state=3D'disabled')=0D
+ file_menu.add_command(label=3Dself.menu_string[5],=0D
+ command=3Dself.save_full_to_delta,=0D
+ state=3D'disabled')=0D
+ file_menu.add_command(label=3D"About", command=3Dself.about)=0D
+ menubar.add_cascade(label=3D"File", menu=3Dfile_menu)=0D
+ self.file_menu =3D file_menu=0D
+=0D
+ root.config(menu=3Dmenubar)=0D
+=0D
+ if len(sys.argv) > 1:=0D
+ path =3D sys.argv[1]=0D
+ if not path.endswith('.yaml') and not path.endswith('.pkl'):=0D
+ messagebox.showerror('LOADING ERROR',=0D
+ "Unsupported file '%s' !" % path)=0D
+ return=0D
+ else:=0D
+ self.load_cfg_file(path)=0D
+=0D
+ if len(sys.argv) > 2:=0D
+ path =3D sys.argv[2]=0D
+ if path.endswith('.dlt'):=0D
+ self.load_delta_file(path)=0D
+ elif path.endswith('.bin'):=0D
+ self.load_bin_file(path)=0D
+ else:=0D
+ messagebox.showerror('LOADING ERROR',=0D
+ "Unsupported file '%s' !" % path)=0D
+ return=0D
+=0D
+    def set_object_name(self, widget, name):
+        self.conf_list[id(widget)] = name
+
+    def get_object_name(self, widget):
+        if id(widget) in self.conf_list:
+            return self.conf_list[id(widget)]
+        else:
+            return None
+
+    def limit_entry_size(self, variable, limit):
+        value = variable.get()
+        if len(value) > limit:
+            variable.set(value[:limit])
+
+    def on_canvas_configure(self, event):
+        self.right_grid.grid_columnconfigure(0, minsize=event.width)
+
+    def on_tree_scroll(self, event):
+        if not self.in_left.get() and self.in_right.get():
+            # This prevents scroll event from being handled by both left and
+            # right frame at the same time.
+            self.on_page_scroll(event)
+            return 'break'
+
+    def on_page_scroll(self, event):
+        if self.in_right.get():
+            # Only scroll when it is in active area
+            min, max = self.page_scroll.get()
+            if not((min == 0.0) and (max == 1.0)):
+                self.conf_canvas.yview_scroll(-1 * int(event.delta / 120),
+                                              'units')
+
+    def update_visibility_for_widget(self, widget, args):
+
+        visible = True
+        item = self.get_config_data_item_from_widget(widget, True)
+        if item is None:
+            return visible
+        elif not item:
+            return visible
+
+        result = 1
+        if item['condition']:
+            result = self.evaluate_condition(item)
+            if result == 2:
+                # Gray
+                widget.configure(state='disabled')
+            elif result == 0:
+                # Hide
+                visible = False
+                widget.grid_remove()
+            else:
+                # Show
+                widget.grid()
+                widget.configure(state='normal')
+
+        return visible
+
+    def update_widgets_visibility_on_page(self):
+        self.walk_widgets_in_layout(self.right_grid,
+                                    self.update_visibility_for_widget)
+
+    def combo_select_changed(self, event):
+        self.update_config_data_from_widget(event.widget, None)
+        self.update_widgets_visibility_on_page()
+
+    def edit_num_finished(self, event):
+        widget = event.widget
+        item = self.get_config_data_item_from_widget(widget)
+        if not item:
+            return
+        parts = item['type'].split(',')
+        if len(parts) > 3:
+            min = parts[2].lstrip()[1:]
+            max = parts[3].rstrip()[:-1]
+            min_val = array_str_to_value(min)
+            max_val = array_str_to_value(max)
+            text = widget.get()
+            if ',' in text:
+                text = '{ %s }' % text
+            try:
+                value = array_str_to_value(text)
+                if value < min_val or value > max_val:
+                    raise Exception('Invalid input!')
+                self.set_config_item_value(item, text)
+            except Exception:
+                pass
+
+            text = item['value'].strip('{').strip('}').strip()
+            widget.delete(0, tkinter.END)
+            widget.insert(0, text)
+
+        self.update_widgets_visibility_on_page()
+
+    def update_page_scroll_bar(self):
+        # Update scrollbar
+        self.frame_right.update()
+        self.conf_canvas.config(scrollregion=self.conf_canvas.bbox("all"))
+
+    def on_config_page_select_change(self, event):
+        self.update_config_data_on_page()
+        sel = self.left.selection()
+        if len(sel) > 0:
+            page_id = sel[0]
+            self.build_config_data_page(page_id)
+            self.update_widgets_visibility_on_page()
+            self.update_page_scroll_bar()
+
+    def walk_widgets_in_layout(self, parent, callback_function, args=None):
+        for widget in parent.winfo_children():
+            callback_function(widget, args)
+
+    def clear_widgets_inLayout(self, parent=None):
+        if parent is None:
+            parent = self.right_grid
+
+        for widget in parent.winfo_children():
+            widget.destroy()
+
+        parent.grid_forget()
+        self.conf_list.clear()
+
+    def build_config_page_tree(self, cfg_page, parent):
+        for page in cfg_page['child']:
+            page_id = next(iter(page))
+            # Put CFG items into related page list
+            self.page_list[page_id] = self.cfg_data_obj.get_cfg_list(page_id)
+            self.page_list[page_id].sort(key=lambda x: x['order'])
+            page_name = self.cfg_data_obj.get_page_title(page_id)
+            child = self.left.insert(
+                parent, 'end',
+                iid=page_id, text=page_name,
+                value=0)
+            if len(page[page_id]) > 0:
+                self.build_config_page_tree(page[page_id], child)
+
+    def is_config_data_loaded(self):
+        return True if len(self.page_list) else False
+
+    def set_current_config_page(self, page_id):
+        self.page_id = page_id
+
+    def get_current_config_page(self):
+        return self.page_id
+
+    def get_current_config_data(self):
+        page_id = self.get_current_config_page()
+        if page_id in self.page_list:
+            return self.page_list[page_id]
+        else:
+            return []
+
+    invalid_values = {}
+
+    def build_config_data_page(self, page_id):
+        self.clear_widgets_inLayout()
+        self.set_current_config_page(page_id)
+        disp_list = []
+        for item in self.get_current_config_data():
+            disp_list.append(item)
+        row = 0
+        disp_list.sort(key=lambda x: x['order'])
+        for item in disp_list:
+            self.add_config_item(item, row)
+            row += 2
+        if self.invalid_values:
+            string = 'The following contains invalid options/values \n\n'
+            for i in self.invalid_values:
+                string += i + ": " + str(self.invalid_values[i]) + "\n"
+            reply = messagebox.showwarning('Warning!', string)
+            if reply == 'ok':
+                self.invalid_values.clear()
+
+    fsp_version = ''
+
+    def load_config_data(self, file_name):
+        gen_cfg_data = CGenYamlCfg()
+        if file_name.endswith('.pkl'):
+            with open(file_name, "rb") as pkl_file:
+                gen_cfg_data.__dict__ = marshal.load(pkl_file)
+            gen_cfg_data.prepare_marshal(False)
+        elif file_name.endswith('.yaml'):
+            if gen_cfg_data.load_yaml(file_name) != 0:
+                raise Exception(gen_cfg_data.get_last_error())
+        else:
+            raise Exception('Unsupported file "%s" !' % file_name)
+        # checking fsp version
+        if gen_cfg_data.detect_fsp():
+            self.fsp_version = '2.X'
+        else:
+            self.fsp_version = '1.X'
+        return gen_cfg_data
+
+    def about(self):
+        msg = 'Configuration Editor\n--------------------------------\n \
+              Version 0.8\n2021'
+        lines = msg.split('\n')
+        width = 30
+        text = []
+        for line in lines:
+            text.append(line.center(width, ' '))
+        messagebox.showinfo('Config Editor', '\n'.join(text))
+
+    def update_last_dir(self, path):
+        self.last_dir = os.path.dirname(path)
+
+    def get_open_file_name(self, ftype):
+        if self.is_config_data_loaded():
+            if ftype == 'dlt':
+                question = ''
+            elif ftype == 'bin':
+                question = 'All configuration will be reloaded from BIN file, \
+                            continue ?'
+            elif ftype == 'yaml':
+                question = ''
+            elif ftype == 'bsf':
+                question = ''
+            else:
+                raise Exception('Unsupported file type !')
+            if question:
+                reply = messagebox.askquestion('', question, icon='warning')
+                if reply == 'no':
+                    return None
+
+        if ftype == 'yaml':
+            if self.mode == 'FSP':
+                file_type = 'YAML'
+                file_ext = 'yaml'
+            else:
+                file_type = 'YAML or PKL'
+                file_ext = 'pkl *.yaml'
+        else:
+            file_type = ftype.upper()
+            file_ext = ftype
+
+        path = filedialog.askopenfilename(
+            initialdir=self.last_dir,
+            title="Load file",
+            filetypes=(("%s files" % file_type, "*.%s" % file_ext), (
+                "all files", "*.*")))
+        if path:
+            self.update_last_dir(path)
+            return path
+        else:
+            return None
+
+    def load_from_delta(self):
+        path = self.get_open_file_name('dlt')
+        if not path:
+            return
+        self.load_delta_file(path)
+
+    def load_delta_file(self, path):
+        self.reload_config_data_from_bin(self.org_cfg_data_bin)
+        try:
+            self.cfg_data_obj.override_default_value(path)
+        except Exception as e:
+            messagebox.showerror('LOADING ERROR', str(e))
+            return
+        self.update_last_dir(path)
+        self.refresh_config_data_page()
+
+    def load_from_bin(self):
+        path = filedialog.askopenfilename(
+            initialdir=self.last_dir,
+            title="Load file",
+            filetypes={("Binaries", "*.fv *.fd *.bin *.rom")})
+        if not path:
+            return
+        self.load_bin_file(path)
+
+    def load_bin_file(self, path):
+        with open(path, 'rb') as fd:
+            bin_data = bytearray(fd.read())
+        if len(bin_data) < len(self.org_cfg_data_bin):
+            messagebox.showerror('Binary file size is smaller than what \
+                                  YAML requires !')
+            return
+
+        try:
+            self.reload_config_data_from_bin(bin_data)
+        except Exception as e:
+            messagebox.showerror('LOADING ERROR', str(e))
+            return
+
+    def load_from_bsf_file(self):
+        path = self.get_open_file_name('bsf')
+        if not path:
+            return
+        self.load_bsf_file(path)
+
+    def load_bsf_file(self, path):
+        bsf_file = path
+        dsc_file = os.path.splitext(bsf_file)[0] + '.dsc'
+        yaml_file = os.path.splitext(bsf_file)[0] + '.yaml'
+        bsf_to_dsc(bsf_file, dsc_file)
+        dsc_to_yaml(dsc_file, yaml_file)
+
+        self.load_cfg_file(yaml_file)
+        return
+
+    def load_from_fd(self):
+        path = filedialog.askopenfilename(
+            initialdir=self.last_dir,
+            title="Load file",
+            filetypes={("Binaries", "*.fv *.fd *.bin *.rom")})
+        if not path:
+            return
+        self.load_fd_file(path)
+
+    def load_fd_file(self, path):
+        with open(path, 'rb') as fd:
+            bin_data = bytearray(fd.read())
+
+        fd = FirmwareDevice(0, bin_data)
+        fd.ParseFd()
+        fd.ParseFsp()
+        fd.OutputFsp()
+
+    def load_cfg_file(self, path):
+        # Save current values in widget and clear database
+        self.clear_widgets_inLayout()
+        self.left.delete(*self.left.get_children())
+
+        self.cfg_data_obj = self.load_config_data(path)
+
+        self.update_last_dir(path)
+        self.org_cfg_data_bin = self.cfg_data_obj.generate_binary_array()
+        self.build_config_page_tree(self.cfg_data_obj.get_cfg_page()['root'],
+                                    '')
+
+        msg_string = 'Click YES if it is FULL FSP '\
+            + self.fsp_version + ' Binary'
+        reply = messagebox.askquestion('Form', msg_string)
+        if reply == 'yes':
+            self.load_from_bin()
+
+        for menu in self.menu_string:
+            self.file_menu.entryconfig(menu, state="normal")
+
+        return 0
+
+    def load_from_yaml(self):
+        path = self.get_open_file_name('yaml')
+        if not path:
+            return
+
+        self.load_cfg_file(path)
+
+    def get_save_file_name(self, extension):
+        path = filedialog.asksaveasfilename(
+            initialdir=self.last_dir,
+            title="Save file",
+            defaultextension=extension)
+        if path:
+            self.last_dir = os.path.dirname(path)
+            return path
+        else:
+            return None
+
+    def save_delta_file(self, full=False):
+        path = self.get_save_file_name(".dlt")
+        if not path:
+            return
+
+        self.update_config_data_on_page()
+        new_data = self.cfg_data_obj.generate_binary_array()
+        self.cfg_data_obj.generate_delta_file_from_bin(path,
+                                                       self.org_cfg_data_bin,
+                                                       new_data, full)
+
+    def save_to_delta(self):
+        self.save_delta_file()
+
+    def save_full_to_delta(self):
+        self.save_delta_file(True)
+
+    def save_to_bin(self):
+        path = self.get_save_file_name(".bin")
+        if not path:
+            return
+
+        self.update_config_data_on_page()
+        bins = self.cfg_data_obj.save_current_to_bin()
+
+        with open(path, 'wb') as fd:
+            fd.write(bins)
+
+    def refresh_config_data_page(self):
+        self.clear_widgets_inLayout()
+        self.on_config_page_select_change(None)
+
+    def reload_config_data_from_bin(self, bin_dat):
+        self.cfg_data_obj.load_default_from_bin(bin_dat)
+        self.refresh_config_data_page()
+
+    def set_config_item_value(self, item, value_str):
+        itype = item['type'].split(',')[0]
+        if itype == "Table":
+            new_value = value_str
+        elif itype == "EditText":
+            length = (self.cfg_data_obj.get_cfg_item_length(item) + 7) // 8
+            new_value = value_str[:length]
+            if item['value'].startswith("'"):
+                new_value = "'%s'" % new_value
+        else:
+            try:
+                new_value = self.cfg_data_obj.reformat_value_str(
+                    value_str,
+                    self.cfg_data_obj.get_cfg_item_length(item),
+                    item['value'])
+            except Exception:
+                print("WARNING: Failed to format value string '%s' for '%s' !"
+                      % (value_str, item['path']))
+                new_value = item['value']
+
+        if item['value'] != new_value:
+            if self.debug:
+                print('Update %s from %s to %s !'
+                      % (item['cname'], item['value'], new_value))
+            item['value'] = new_value
+
+    def get_config_data_item_from_widget(self, widget, label=False):
+        name = self.get_object_name(widget)
+        if not name or not len(self.page_list):
+            return None
+
+        if name.startswith('LABEL_'):
+            if label:
+                path = name[6:]
+            else:
+                return None
+        else:
+            path = name
+        item = self.cfg_data_obj.get_item_by_path(path)
+        return item
+
+    def update_config_data_from_widget(self, widget, args):
+        item = self.get_config_data_item_from_widget(widget)
+        if item is None:
+            return
+        elif not item:
+            if isinstance(widget, tkinter.Label):
+                return
+            raise Exception('Failed to find "%s" !' %
+                            self.get_object_name(widget))
+
+        itype = item['type'].split(',')[0]
+        if itype == "Combo":
+            opt_list = self.cfg_data_obj.get_cfg_item_options(item)
+            tmp_list = [opt[0] for opt in opt_list]
+            idx = widget.current()
+            if idx != -1:
+                self.set_config_item_value(item, tmp_list[idx])
+        elif itype in ["EditNum", "EditText"]:
+            self.set_config_item_value(item, widget.get())
+        elif itype in ["Table"]:
+            new_value = bytes_to_bracket_str(widget.get())
+            self.set_config_item_value(item, new_value)
+
+    def evaluate_condition(self, item):
+        try:
+            result = self.cfg_data_obj.evaluate_condition(item)
+        except Exception:
+            print("WARNING: Condition '%s' is invalid for '%s' !"
+                  % (item['condition'], item['path']))
+            result = 1
+        return result
+
+    def add_config_item(self, item, row):
+        parent = self.right_grid
+
+        name = tkinter.Label(parent, text=item['name'], anchor="w")
+
+        parts = item['type'].split(',')
+        itype = parts[0]
+        widget = None
+
+        if itype == "Combo":
+            # Build
+            opt_list = self.cfg_data_obj.get_cfg_item_options(item)
+            current_value = self.cfg_data_obj.get_cfg_item_value(item, False)
+            option_list = []
+            current = None
+
+            for idx, option in enumerate(opt_list):
+                option_str = option[0]
+                try:
+                    option_value = self.cfg_data_obj.get_value(
+                        option_str,
+                        len(option_str), False)
+                except Exception:
+                    option_value = 0
+                    print('WARNING: Option "%s" has invalid format for "%s" !'
+                          % (option_str, item['path']))
+                if option_value == current_value:
+                    current = idx
+                option_list.append(option[1])
+
+            widget = ttk.Combobox(parent, value=option_list, state="readonly")
+            widget.bind("<<ComboboxSelected>>", self.combo_select_changed)
+            widget.unbind_class("TCombobox", "<MouseWheel>")
+
+            if current is None:
+                print('WARNING: Value "%s" is an invalid option for "%s" !' %
+                      (current_value, item['path']))
+                self.invalid_values[item['path']] = current_value
+            else:
+                widget.current(current)
+
+        elif itype in ["EditNum", "EditText"]:
+            txt_val = tkinter.StringVar()
+            widget = tkinter.Entry(parent, textvariable=txt_val)
+            value = item['value'].strip("'")
+            if itype in ["EditText"]:
+                txt_val.trace(
+                    'w',
+                    lambda *args: self.limit_entry_size
+                    (txt_val, (self.cfg_data_obj.get_cfg_item_length(item)
+                               + 7) // 8))
+            elif itype in ["EditNum"]:
+                value = item['value'].strip("{").strip("}").strip()
+                widget.bind("<FocusOut>", self.edit_num_finished)
+            txt_val.set(value)
+
+        elif itype in ["Table"]:
+            bins = self.cfg_data_obj.get_cfg_item_value(item, True)
+            col_hdr = item['option'].split(',')
+            widget = custom_table(parent, col_hdr, bins)
+
+        else:
+            if itype and itype not in ["Reserved"]:
+                print("WARNING: Type '%s' is invalid for '%s' !" %
+                      (itype, item['path']))
+                self.invalid_values[item['path']] = itype
+
+        if widget:
+            create_tool_tip(widget, item['help'])
+            self.set_object_name(name, 'LABEL_' + item['path'])
+            self.set_object_name(widget, item['path'])
+            name.grid(row=row, column=0, padx=10, pady=5, sticky="nsew")
+            widget.grid(row=row + 1, rowspan=1, column=0,
+                        padx=10, pady=5, sticky="nsew")
+
+    def update_config_data_on_page(self):
+        self.walk_widgets_in_layout(self.right_grid,
+                                    self.update_config_data_from_widget)
+
+
+if __name__ == '__main__':
+    root = tkinter.Tk()
+    app = application(master=root)
+    root.title("Config Editor")
+    root.mainloop()
diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py b/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py
new file mode 100644
index 0000000000..25fd9c547e
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/GenYamlCfg.py
@@ -0,0 +1,2252 @@
+# @ GenYamlCfg.py
+#
+# Copyright (c) 2020 - 2021, Intel Corporation. All rights reserved.<BR>
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+#
+
+import os
+import sys
+import re
+import marshal
+import string
+import operator as op
+import ast
+import tkinter.messagebox as messagebox
+
+from datetime import date
+from collections import OrderedDict
+from CommonUtility import value_to_bytearray, value_to_bytes, \
+    bytes_to_value, get_bits_from_bytes, set_bits_to_bytes
+
+# Generated file copyright header
+__copyright_tmp__ = """/** @file
+
+  Platform Configuration %s File.
+
+  Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>
+  SPDX-License-Identifier: BSD-2-Clause-Patent
+
+  This file is automatically generated. Please do NOT modify !!!
+
+**/
+"""
+
+
+def get_copyright_header(file_type, allow_modify=False):
+    file_description = {
+        'yaml': 'Boot Setting',
+        'dlt': 'Delta',
+        'inc': 'C Binary Blob',
+        'h': 'C Struct Header'
+    }
+    if file_type in ['yaml', 'dlt']:
+        comment_char = '#'
+    else:
+        comment_char = ''
+    lines = __copyright_tmp__.split('\n')
+    if allow_modify:
+        lines = [line for line in lines if 'Please do NOT modify' not in line]
+    copyright_hdr = '\n'.join('%s%s' % (comment_char, line)
+                              for line in lines)[:-1] + '\n'
+    return copyright_hdr % (file_description[file_type], date.today().year)
+
+
+def check_quote(text):
+    if (text[0] == "'" and text[-1] == "'") or (text[0] == '"'
+                                                and text[-1] == '"'):
+        return True
+    return False
+
+
+def strip_quote(text):
+    new_text = text.strip()
+    if check_quote(new_text):
+        return new_text[1:-1]
+    return text
+
+
+def strip_delimiter(text, delim):
+    new_text = text.strip()
+    if new_text:
+        if new_text[0] == delim[0] and new_text[-1] == delim[-1]:
+            return new_text[1:-1]
+    return text
+
+
+def bytes_to_bracket_str(bytes):
+    return '{ %s }' % (', '.join('0x%02x' % i for i in bytes))
+
+
+def array_str_to_value(val_str):
+    val_str = val_str.strip()
+    val_str = strip_delimiter(val_str, '{}')
+    val_str = strip_quote(val_str)
+    value = 0
+    for each in val_str.split(',')[::-1]:
+        each = each.strip()
+        value = (value << 8) | int(each, 0)
+    return value
+
+
+def write_lines(lines, file):
+    fo = open(file, "w")
+    fo.write(''.join([x[0] for x in lines]))
+    fo.close()
+
+
+def read_lines(file):
+    if not os.path.exists(file):
+        test_file = os.path.basename(file)
+        if os.path.exists(test_file):
+            file = test_file
+    fi = open(file, 'r')
+    lines = fi.readlines()
+    fi.close()
+    return lines
+
+
+def expand_file_value(path, value_str):
+    result = bytearray()
+    match = re.match("\\{\\s*FILE:(.+)\\}", value_str)
+    if match:
+        file_list = match.group(1).split(',')
+        for file in file_list:
+            file = file.strip()
+            bin_path = os.path.join(path, file)
+            result.extend(bytearray(open(bin_path, 'rb').read()))
+        print('\n\n result ', result)
+    return result
+
+
+class ExpressionEval(ast.NodeVisitor):
+    operators = {
+        ast.Add: op.add,
+        ast.Sub: op.sub,
+        ast.Mult: op.mul,
+        ast.Div: op.floordiv,
+        ast.Mod: op.mod,
+        ast.Eq: op.eq,
+        ast.NotEq: op.ne,
+        ast.Gt: op.gt,
+        ast.Lt: op.lt,
+        ast.GtE: op.ge,
+        ast.LtE: op.le,
+        ast.BitXor: op.xor,
+        ast.BitAnd: op.and_,
+        ast.BitOr: op.or_,
+        ast.Invert: op.invert,
+        ast.USub: op.neg
+    }
+
+    def __init__(self):
+        self._debug = False
+        self._expression = ''
+        self._namespace = {}
+        self._get_variable = None
+
+    def eval(self, expr, vars={}):
+        self._expression = expr
+        if type(vars) is dict:
+            self._namespace = vars
+            self._get_variable = None
+        else:
+            self._namespace = {}
+            self._get_variable = vars
+        node = ast.parse(self._expression, mode='eval')
+        result = self.visit(node.body)
+        if self._debug:
+            print('EVAL [ %s ] = %s' % (expr, str(result)))
+        return result
+
+    def visit_Name(self, node):
+        if self._get_variable is not None:
+            return self._get_variable(node.id)
+        else:
+            return self._namespace[node.id]
+
+    def visit_Num(self, node):
+        return node.n
+
+    def visit_NameConstant(self, node):
+        return node.value
+
+    def visit_BoolOp(self, node):
+        result = False
+        if isinstance(node.op, ast.And):
+            for value in node.values:
+                result = self.visit(value)
+                if not result:
+                    break
+        elif isinstance(node.op, ast.Or):
+            for value in node.values:
+                result = self.visit(value)
+                if result:
+                    break
+        return True if result else False
+
+    def visit_UnaryOp(self, node):
+        val = self.visit(node.operand)
+        return ExpressionEval.operators[type(node.op)](val)
+
+    def visit_BinOp(self, node):
+        lhs = self.visit(node.left)
+        rhs = self.visit(node.right)
+        return ExpressionEval.operators[type(node.op)](lhs, rhs)
+
+    def visit_Compare(self, node):
+        right = self.visit(node.left)
+        result = True
+        for operation, comp in zip(node.ops, node.comparators):
+            if not result:
+                break
+            left = right
+            right = self.visit(comp)
+            result = ExpressionEval.operators[type(operation)](left, right)
+        return result
+
+    def visit_Call(self, node):
+        if node.func.id in ['ternary']:
+            condition = self.visit(node.args[0])
+            val_true = self.visit(node.args[1])
+            val_false = self.visit(node.args[2])
+            return val_true if condition else val_false
+        elif node.func.id in ['offset', 'length']:
+            if self._get_variable is not None:
+                return self._get_variable(node.args[0].s, node.func.id)
+        else:
+            raise ValueError("Unsupported function: " + repr(node))
+
+    def generic_visit(self, node):
+        raise ValueError("malformed node or string: " + repr(node))
+
+
+class CFG_YAML():=0D
+ TEMPLATE =3D 'template'=0D
+ CONFIGS =3D 'configs'=0D
+ VARIABLE =3D 'variable'=0D
+=0D
+ def __init__(self):=0D
+ self.log_line =3D False=0D
+ self.allow_template =3D False=0D
+ self.cfg_tree =3D None=0D
+ self.tmp_tree =3D None=0D
+ self.var_dict =3D None=0D
+ self.def_dict =3D {}=0D
+ self.yaml_path =3D ''=0D
+ self.lines =3D []=0D
+ self.full_lines =3D []=0D
+ self.index =3D 0=0D
+ self.re_expand =3D re.compile(=0D
+ r'(.+:\s+|\s*\-\s*)!expand\s+\{\s*(\w+_TMPL)\s*:\s*\[(.+)]\s*\=
}')=0D
+ self.re_include =3D re.compile(r'(.+:\s+|\s*\-\s*)!include\s+(.+)'=
)=0D
+=0D
+ @staticmethod=0D
+ def count_indent(line):=0D
+ return next((i for i, c in enumerate(line) if not c.isspace()),=0D
+ len(line))=0D
+=0D
+ @staticmethod=0D
+ def substitue_args(text, arg_dict):=0D
+ for arg in arg_dict:=0D
+ text =3D text.replace('$' + arg, arg_dict[arg])=0D
+ return text=0D
+=0D
+ @staticmethod=0D
+ def dprint(*args):=0D
+ pass=0D
+=0D
+ def process_include(self, line, insert=3DTrue):=0D
+ match =3D self.re_include.match(line)=0D
+ if not match:=0D
+ raise Exception("Invalid !include format '%s' !" % line.strip(=
))=0D
+=0D
+ prefix =3D match.group(1)=0D
+ include =3D match.group(2)=0D
+ if prefix.strip() =3D=3D '-':=0D
+ prefix =3D ''=0D
+ adjust =3D 0=0D
+ else:=0D
+ adjust =3D 2=0D
+=0D
+ include =3D strip_quote(include)=0D
+ request =3D CFG_YAML.count_indent(line) + adjust=0D
+=0D
+ if self.log_line:=0D
+ # remove the include line itself=0D
+ del self.full_lines[-1]=0D
+=0D
+ inc_path =3D os.path.join(self.yaml_path, include)=0D
+ if not os.path.exists(inc_path):=0D
+ # try relative path to project root=0D
+ try_path =3D os.path.join(os.path.dirname(os.path.realpath(__f=
ile__)=0D
+ ), "../..", include)=0D
+ if os.path.exists(try_path):=0D
+                inc_path = try_path
+            else:
+                raise Exception("ERROR: Cannot open file '%s'." % inc_path)
+
+        lines = read_lines(inc_path)
+        current = 0
+        same_line = False
+        for idx, each in enumerate(lines):
+            start = each.lstrip()
+            if start == '' or start[0] == '#':
+                continue
+
+            if start[0] == '>':
+                # append the content directly at the same line
+                same_line = True
+
+            start = idx
+            current = CFG_YAML.count_indent(each)
+            break
+
+        lines = lines[start+1:] if same_line else lines[start:]
+        leading = ''
+        if same_line:
+            request = len(prefix)
+            leading = '>'
+
+        lines = [prefix + '%s\n' % leading] + [' ' * request +
+                                               i[current:] for i in lines]
+        if insert:
+            self.lines = lines + self.lines
+
+        return lines
+
+    def process_expand(self, line):
+        match = self.re_expand.match(line)
+        if not match:
+            raise Exception("Invalid !expand format '%s' !" % line.strip())
+        lines = []
+        prefix = match.group(1)
+        temp_name = match.group(2)
+        args = match.group(3)
+
+        if prefix.strip() == '-':
+            indent = 0
+        else:
+            indent = 2
+        lines = self.process_expand_template(temp_name, prefix, args, indent)
+        self.lines = lines + self.lines
+
+    def process_expand_template(self, temp_name, prefix, args, indent=2):
+        # expand text with arg substitution
+        if temp_name not in self.tmp_tree:
+            raise Exception("Could not find template '%s' !" % temp_name)
+        parts = args.split(',')
+        parts = [i.strip() for i in parts]
+        num = len(parts)
+        arg_dict = dict(zip(['(%d)' % (i + 1) for i in range(num)], parts))
+        str_data = self.tmp_tree[temp_name]
+        text = DefTemplate(str_data).safe_substitute(self.def_dict)
+        text = CFG_YAML.substitue_args(text, arg_dict)
+        target = CFG_YAML.count_indent(prefix) + indent
+        current = CFG_YAML.count_indent(text)
+        padding = target * ' '
+        if indent == 0:
+            leading = []
+        else:
+            leading = [prefix + '\n']
+        text = leading + [(padding + i + '\n')[current:]
+                          for i in text.splitlines()]
+        return text
+
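For readers following the template machinery, the positional-argument binding used by process_expand_template above can be reproduced in isolation (a minimal sketch, not part of the patch): each comma-separated argument of an `!expand` line is bound to a placeholder key `(1)`, `(2)`, ... for later substitution.

```python
# Sketch of the arg_dict construction in process_expand_template:
# '!expand' arguments map positionally onto '(1)', '(2)', ... keys.
args = 'MEMORY_SPD, 0x80'
parts = [i.strip() for i in args.split(',')]
arg_dict = dict(zip(['(%d)' % (i + 1) for i in range(len(parts))], parts))
# arg_dict is now {'(1)': 'MEMORY_SPD', '(2)': '0x80'}
```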
+    def load_file(self, yaml_file):
+        self.index = 0
+        self.lines = read_lines(yaml_file)
+
+    def peek_line(self):
+        if len(self.lines) == 0:
+            return None
+        else:
+            return self.lines[0]
+
+    def put_line(self, line):
+        self.lines.insert(0, line)
+        if self.log_line:
+            del self.full_lines[-1]
+
+    def get_line(self):
+        if len(self.lines) == 0:
+            return None
+        else:
+            line = self.lines.pop(0)
+            if self.log_line:
+                self.full_lines.append(line.rstrip())
+            return line
+
+    def get_multiple_line(self, indent):
+        text = ''
+        newind = indent + 1
+        while True:
+            line = self.peek_line()
+            if line is None:
+                break
+            sline = line.strip()
+            if sline != '':
+                newind = CFG_YAML.count_indent(line)
+                if newind <= indent:
+                    break
+            self.get_line()
+            if sline != '':
+                text = text + line
+        return text
+
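The indent rule that get_multiple_line() applies can be sketched standalone; `count_indent` below is a simplified stand-in for `CFG_YAML.count_indent` that assumes plain leading spaces, so this is illustrative only: non-blank lines are consumed while they sit deeper than the given indent.

```python
# Illustrative stand-in for CFG_YAML.count_indent (leading spaces only).
def count_indent(line):
    return len(line) - len(line.lstrip())

# Consume lines from the front of the queue while non-blank lines are
# indented deeper than `indent`, mirroring get_multiple_line above.
def collect_block(lines, indent):
    text = ''
    while lines:
        line = lines[0]
        sline = line.strip()
        if sline != '' and count_indent(line) <= indent:
            break
        lines.pop(0)
        if sline != '':
            text += line
    return text

block = collect_block(['    a\n', '    b\n', 'top\n'], 2)
# block collects the two deeper-indented lines; 'top' stays queued
```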
+    def traverse_cfg_tree(self, handler):
+        def _traverse_cfg_tree(root, level=0):
+            # config structure
+            for key in root:
+                if type(root[key]) is OrderedDict:
+                    level += 1
+                    handler(key, root[key], level)
+                    _traverse_cfg_tree(root[key], level)
+                    level -= 1
+        _traverse_cfg_tree(self.cfg_tree)
+
+    def count(self):
+        def _count(name, cfgs, level):
+            num[0] += 1
+        num = [0]
+        self.traverse_cfg_tree(_count)
+        return num[0]
+
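The traverse/count pair above boils down to a recursive walk over nested OrderedDict nodes, invoking a handler once per dictionary node; a self-contained illustration (not part of the patch):

```python
from collections import OrderedDict

# Handler is called once per OrderedDict node; counting calls counts
# config nodes, as CFG_YAML.count does via traverse_cfg_tree.
def traverse(root, handler, level=0):
    for key in root:
        if type(root[key]) is OrderedDict:
            handler(key, root[key], level + 1)
            traverse(root[key], handler, level + 1)

tree = OrderedDict([('A', OrderedDict([('x', '1')])),
                    ('B', OrderedDict())])
num = [0]
traverse(tree, lambda k, v, lv: num.__setitem__(0, num[0] + 1))
# num[0] == 2: only 'A' and 'B' are OrderedDict nodes ('x' is a leaf str)
```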
+    def parse(self, parent_name='', curr=None, level=0):
+        child = None
+        last_indent = None
+        key = ''
+        temp_chk = {}
+
+        while True:
+            line = self.get_line()
+            if line is None:
+                break
+
+            curr_line = line.strip()
+            if curr_line == '' or curr_line[0] == '#':
+                continue
+
+            indent = CFG_YAML.count_indent(line)
+            if last_indent is None:
+                last_indent = indent
+
+            if indent != last_indent:
+                # outside of current block, put the line back to queue
+                self.put_line(' ' * indent + curr_line)
+
+            if curr_line.endswith(': >'):
+                # multiline marker
+                old_count = len(self.full_lines)
+                line = self.get_multiple_line(indent)
+                if self.log_line and not self.allow_template \
+                        and '!include ' in line:
+                    # expand include in template
+                    new_lines = []
+                    lines = line.splitlines()
+                    for idx, each in enumerate(lines):
+                        if '!include ' in each:
+                            new_line = ''.join(self.process_include(each,
+                                                                    False))
+                            new_lines.append(new_line)
+                        else:
+                            new_lines.append(each)
+                    self.full_lines = self.full_lines[:old_count] + new_lines
+                curr_line = curr_line + line
+
+            if indent > last_indent:
+                # child nodes
+                if child is None:
+                    raise Exception('Unexpected format at line: %s'
+                                    % (curr_line))
+
+                level += 1
+                self.parse(key, child, level)
+                level -= 1
+                line = self.peek_line()
+                if line is not None:
+                    curr_line = line.strip()
+                    indent = CFG_YAML.count_indent(line)
+                    if indent >= last_indent:
+                        # consume the line
+                        self.get_line()
+                else:
+                    # end of file
+                    indent = -1
+
+            if curr is None:
+                curr = OrderedDict()
+
+            if indent < last_indent:
+                return curr
+
+            marker1 = curr_line[0]
+            marker2 = curr_line[-1]
+            start = 1 if marker1 == '-' else 0
+            pos = curr_line.find(': ')
+            if pos > 0:
+                child = None
+                key = curr_line[start:pos].strip()
+                if curr_line[pos + 2] == '>':
+                    curr[key] = curr_line[pos + 3:]
+                else:
+                    # XXXX: !include / !expand
+                    if '!include ' in curr_line:
+                        self.process_include(line)
+                    elif '!expand ' in curr_line:
+                        if self.allow_template and not self.log_line:
+                            self.process_expand(line)
+                    else:
+                        value_str = curr_line[pos + 2:].strip()
+                        curr[key] = value_str
+                        if self.log_line and value_str[0] == '{':
+                            # expand {FILE: xxxx} format in the log line
+                            if value_str[1:].rstrip().startswith('FILE:'):
+                                value_bytes = expand_file_value(
+                                    self.yaml_path, value_str)
+                                value_str = bytes_to_bracket_str(value_bytes)
+                                self.full_lines[-1] = line[
+                                    :indent] + curr_line[:pos + 2] + value_str
+
+            elif marker2 == ':':
+                child = OrderedDict()
+                key = curr_line[start:-1].strip()
+                if key == '$ACTION':
+                    # special virtual nodes, rename to ensure unique key
+                    key = '$ACTION_%04X' % self.index
+                    self.index += 1
+                if key in curr:
+                    if key not in temp_chk:
+                        # check for duplicated keys at same level
+                        temp_chk[key] = 1
+                    else:
+                        raise Exception("Duplicated item '%s:%s' found !"
+                                        % (parent_name, key))
+
+                curr[key] = child
+                if self.var_dict is None and key == CFG_YAML.VARIABLE:
+                    self.var_dict = child
+                if self.tmp_tree is None and key == CFG_YAML.TEMPLATE:
+                    self.tmp_tree = child
+                if self.var_dict:
+                    for each in self.var_dict:
+                        txt = self.var_dict[each]
+                        if type(txt) is str:
+                            self.def_dict['(%s)' % each] = txt
+                if self.tmp_tree and key == CFG_YAML.CONFIGS:
+                    # apply template for the main configs
+                    self.allow_template = True
+            else:
+                child = None
+                # - !include cfg_opt.yaml
+                if '!include ' in curr_line:
+                    self.process_include(line)
+
+        return curr
+
+    def load_yaml(self, opt_file):
+        self.var_dict = None
+        self.yaml_path = os.path.dirname(opt_file)
+        self.load_file(opt_file)
+        yaml_tree = self.parse()
+        self.tmp_tree = yaml_tree[CFG_YAML.TEMPLATE]
+        self.cfg_tree = yaml_tree[CFG_YAML.CONFIGS]
+        return self.cfg_tree
+
+    def expand_yaml(self, opt_file):
+        self.log_line = True
+        self.load_yaml(opt_file)
+        self.log_line = False
+        text = '\n'.join(self.full_lines)
+        self.full_lines = []
+        return text
+
+
+class DefTemplate(string.Template):
+    idpattern = '\\([_A-Z][_A-Z0-9]*\\)|[_A-Z][_A-Z0-9]*'
+
+
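DefTemplate's custom `idpattern` lets `string.Template` recognize both `$NAME` and `$(NAME)` placeholders; the parenthesized form is looked up under the literal key `'(NAME)'` in the substitution dictionary (which is why the code above stores defines as `'(%s)' % name`). A self-contained demonstration, outside the patch:

```python
import string

# Same idpattern as the patch: match '(NAME)' or 'NAME' after the '$'.
class DefTemplate(string.Template):
    idpattern = '\\([_A-Z][_A-Z0-9]*\\)|[_A-Z][_A-Z0-9]*'

# '$(FOO)' resolves via the literal dict key '(FOO)'; '$BAR' via 'BAR'.
out = DefTemplate('$(FOO)/$BAR').safe_substitute({'(FOO)': 'a', 'BAR': 'b'})
# out == 'a/b'
```

safe_substitute (rather than substitute) leaves unknown placeholders in place instead of raising, which is what lets the loader run multiple substitution passes.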
+class CGenYamlCfg:
+    STRUCT = '$STRUCT'
+    bits_width = {'b': 1, 'B': 8, 'W': 16, 'D': 32, 'Q': 64}
+    builtin_option = {'$EN_DIS': [('0', 'Disable'), ('1', 'Enable')]}
+    exclude_struct = ['FSP_UPD_HEADER', 'FSPT_ARCH_UPD',
+                      'FSPM_ARCH_UPD', 'FSPS_ARCH_UPD',
+                      'GPIO_GPP_*', 'GPIO_CFG_DATA',
+                      'GpioConfPad*', 'GpioPinConfig',
+                      'BOOT_OPTION*', 'PLATFORMID_CFG_DATA', '\\w+_Half[01]']
+    include_tag = ['GPIO_CFG_DATA']
+    keyword_set = set(['name', 'type', 'option', 'help', 'length',
+                       'value', 'order', 'struct', 'condition'])
+
+    def __init__(self):
+        self._mode = ''
+        self._debug = False
+        self._macro_dict = {}
+        self.initialize()
+
+    def initialize(self):
+        self._old_bin = None
+        self._cfg_tree = {}
+        self._tmp_tree = {}
+        self._cfg_list = []
+        self._cfg_page = {'root': {'title': '', 'child': []}}
+        self._cur_page = ''
+        self._var_dict = {}
+        self._def_dict = {}
+        self._yaml_path = ''
+
+    @staticmethod
+    def deep_convert_dict(layer):
+        # convert OrderedDict to list + dict
+        new_list = layer
+        if isinstance(layer, OrderedDict):
+            new_list = list(layer.items())
+            for idx, pair in enumerate(new_list):
+                new_node = CGenYamlCfg.deep_convert_dict(pair[1])
+                new_list[idx] = dict({pair[0]: new_node})
+        return new_list
+
+    @staticmethod
+    def deep_convert_list(layer):
+        if isinstance(layer, list):
+            od = OrderedDict({})
+            for each in layer:
+                if isinstance(each, dict):
+                    key = next(iter(each))
+                    od[key] = CGenYamlCfg.deep_convert_list(each[key])
+            return od
+        else:
+            return layer
+
+    @staticmethod
+    def expand_include_files(file_path, cur_dir=''):
+        if cur_dir == '':
+            cur_dir = os.path.dirname(file_path)
+            file_path = os.path.basename(file_path)
+
+        input_file_path = os.path.join(cur_dir, file_path)
+        file = open(input_file_path, "r")
+        lines = file.readlines()
+        file.close()
+        new_lines = []
+        for line_num, line in enumerate(lines):
+            match = re.match("^!include\\s*(.+)?$", line.strip())
+            if match:
+                inc_path = match.group(1)
+                tmp_path = os.path.join(cur_dir, inc_path)
+                org_path = tmp_path
+                if not os.path.exists(tmp_path):
+                    cur_dir = os.path.join(os.path.dirname
+                                           (os.path.realpath(__file__)
+                                            ), "..", "..")
+                    tmp_path = os.path.join(cur_dir, inc_path)
+                if not os.path.exists(tmp_path):
+                    raise Exception("ERROR: Cannot open include\
+ file '%s'." % org_path)
+                else:
+                    new_lines.append(('# Included from file: %s\n' % inc_path,
+                                      tmp_path, 0))
+                    new_lines.append(('# %s\n' % ('=' * 80), tmp_path, 0))
+                    new_lines.extend(CGenYamlCfg.expand_include_files
+                                     (inc_path, cur_dir))
+            else:
+                new_lines.append((line, input_file_path, line_num))
+
+        return new_lines
+
+    @staticmethod
+    def format_struct_field_name(input, count=0):
+        name = ''
+        cap = True
+        if '_' in input:
+            input = input.lower()
+        for each in input:
+            if each == '_':
+                cap = True
+                continue
+            elif cap:
+                each = each.upper()
+                cap = False
+            name = name + each
+
+        if count > 1:
+            name = '%s[%d]' % (name, count)
+
+        return name
+
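format_struct_field_name converts a snake_case config name into a CamelCase C struct field, appending an array suffix when a repeat count greater than one is given. A standalone copy of the function (lifted verbatim for illustration) shows the behavior:

```python
# Verbatim standalone copy of format_struct_field_name for illustration:
# snake_case -> CamelCase, with '[count]' appended when count > 1.
def format_struct_field_name(input, count=0):
    name = ''
    cap = True
    if '_' in input:
        input = input.lower()
    for each in input:
        if each == '_':
            cap = True      # next character starts a new word
            continue
        elif cap:
            each = each.upper()
            cap = False
        name = name + each

    if count > 1:
        name = '%s[%d]' % (name, count)
    return name

field = format_struct_field_name('mem_cfg_data', 4)
# field == 'MemCfgData[4]'
```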
+    def get_mode(self):
+        return self._mode
+
+    def set_mode(self, mode):
+        self._mode = mode
+
+    def get_last_error(self):
+        return ''
+
+    def get_variable(self, var, attr='value'):
+        if var in self._var_dict:
+            var = self._var_dict[var]
+            return var
+
+        item = self.locate_cfg_item(var, False)
+        if item is None:
+            raise ValueError("Cannot find variable '%s' !" % var)
+
+        if item:
+            if 'indx' in item:
+                item = self.get_item_by_index(item['indx'])
+            if attr == 'offset':
+                var = item['offset']
+            elif attr == 'length':
+                var = item['length']
+            elif attr == 'value':
+                var = self.get_cfg_item_value(item)
+            else:
+                raise ValueError("Unsupported variable attribute '%s' !" %
+                                 attr)
+        return var
+
+    def eval(self, expr):
+        def _handler(pattern):
+            if pattern.group(1):
+                target = 1
+            else:
+                target = 2
+            result = self.get_variable(pattern.group(target))
+            if result is None:
+                raise ValueError('Unknown variable $(%s) !' %
+                                 pattern.group(target))
+            return hex(result)
+
+        expr_eval = ExpressionEval()
+        if '$' in expr:
+            # replace known variable first
+            expr = re.sub(r'\$\(([_a-zA-Z][\w\.]*)\)|\$([_a-zA-Z][\w\.]*)',
+                          _handler, expr)
+        return expr_eval.eval(expr, self.get_variable)
+
+    def parse_macros(self, macro_def_str):
+        # ['-DABC=1', '-D', 'CFG_DEBUG=1', '-D', 'CFG_OUTDIR=Build']
+        self._macro_dict = {}
+        is_expression = False
+        for macro in macro_def_str:
+            if macro.startswith('-D'):
+                is_expression = True
+                if len(macro) > 2:
+                    macro = macro[2:]
+                else:
+                    continue
+            if is_expression:
+                is_expression = False
+                match = re.match("(\\w+)=(.+)", macro)
+                if match:
+                    self._macro_dict[match.group(1)] = match.group(2)
+                else:
+                    match = re.match("(\\w+)", macro)
+                    if match:
+                        self._macro_dict[match.group(1)] = ''
+        if len(self._macro_dict) == 0:
+            error = 1
+        else:
+            error = 0
+            if self._debug:
+                print("INFO : Macro dictionary:")
+                for each in self._macro_dict:
+                    print("    $(%s) = [ %s ]"
+                          % (each, self._macro_dict[each]))
+        return error
+
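parse_macros accepts both fused (`-DABC=1`) and split (`-D`, `CFG_DEBUG=1`) macro definitions from the command line, as the comment in the patch shows. A simplified standalone sketch (returning the dictionary directly rather than the error code):

```python
import re

# Simplified sketch of parse_macros: '-D' may be fused with its definition
# or passed as a separate token; bare words without a preceding '-D' are
# ignored.
def parse_macros(macro_def_str):
    macro_dict = {}
    is_expression = False
    for macro in macro_def_str:
        if macro.startswith('-D'):
            is_expression = True
            if len(macro) > 2:
                macro = macro[2:]        # fused form: strip '-D'
            else:
                continue                 # split form: value comes next
        if is_expression:
            is_expression = False
            match = re.match("(\\w+)=(.+)", macro)
            if match:
                macro_dict[match.group(1)] = match.group(2)
            else:
                match = re.match("(\\w+)", macro)
                if match:
                    macro_dict[match.group(1)] = ''
    return macro_dict

macros = parse_macros(['-DABC=1', '-D', 'CFG_DEBUG=1', 'IGNORED'])
# macros == {'ABC': '1', 'CFG_DEBUG': '1'}
```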
+    def get_cfg_list(self, page_id=None):
+        if page_id is None:
+            # return full list
+            return self._cfg_list
+        else:
+            # build a new list for items under a page ID
+            cfgs = [i for i in self._cfg_list if i['cname'] and
+                    (i['page'] == page_id)]
+            return cfgs
+
+    def get_cfg_page(self):
+        return self._cfg_page
+
+    def get_cfg_item_length(self, item):
+        return item['length']
+
+    def get_cfg_item_value(self, item, array=False):
+        value_str = item['value']
+        length = item['length']
+        return self.get_value(value_str, length, array)
+
+    def format_value_to_str(self, value, bit_length, old_value=''):
+        # value is always int
+        length = (bit_length + 7) // 8
+        fmt = ''
+        if old_value.startswith('0x'):
+            fmt = '0x'
+        elif old_value and (old_value[0] in ['"', "'", '{']):
+            fmt = old_value[0]
+        else:
+            fmt = ''
+
+        bvalue = value_to_bytearray(value, length)
+        if fmt in ['"', "'"]:
+            svalue = bvalue.rstrip(b'\x00').decode()
+            value_str = fmt + svalue + fmt
+        elif fmt == "{":
+            value_str = '{ ' + ', '.join(['0x%02x' % i for i in bvalue]) + ' }'
+        elif fmt == '0x':
+            hex_len = length * 2
+            if len(old_value) == hex_len + 2:
+                fstr = '0x%%0%dx' % hex_len
+            else:
+                fstr = '0x%x'
+            value_str = fstr % value
+        else:
+            if length <= 2:
+                value_str = '%d' % value
+            elif length <= 8:
+                value_str = '0x%x' % value
+            else:
+                value_str = '{ ' + ', '.join(['0x%02x' % i for i in
+                                              bvalue]) + ' }'
+        return value_str
+
+    def reformat_value_str(self, value_str, bit_length, old_value=None):
+        value = self.parse_value(value_str, bit_length, False)
+        if old_value is None:
+            old_value = value_str
+        new_value = self.format_value_to_str(value, bit_length, old_value)
+        return new_value
+
+    def get_value(self, value_str, bit_length, array=True):
+        value_str = value_str.strip()
+        if value_str[0] == "'" and value_str[-1] == "'" or \
+           value_str[0] == '"' and value_str[-1] == '"':
+            value_str = value_str[1:-1]
+            bvalue = bytearray(value_str.encode())
+            if len(bvalue) == 0:
+                bvalue = bytearray(b'\x00')
+            if array:
+                return bvalue
+            else:
+                return bytes_to_value(bvalue)
+        else:
+            if value_str[0] in '{':
+                value_str = value_str[1:-1].strip()
+            value = 0
+            for each in value_str.split(',')[::-1]:
+                each = each.strip()
+                value = (value << 8) | int(each, 0)
+            if array:
+                length = (bit_length + 7) // 8
+                return value_to_bytearray(value, length)
+            else:
+                return value
+
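The brace-list branch of get_value treats `{ a, b, c }` as little-endian bytes: the list is folded in reverse, so the last element ends up in the most significant position. In isolation (a sketch, not the patch code itself):

```python
# Sketch of the '{ ... }' branch of get_value: fold the comma-separated
# elements in reverse so the first element becomes the lowest byte.
def brace_list_to_value(value_str):
    value_str = value_str.strip()[1:-1].strip()   # drop the braces
    value = 0
    for each in value_str.split(',')[::-1]:
        value = (value << 8) | int(each.strip(), 0)
    return value

val = brace_list_to_value('{ 0x11, 0x22, 0x33 }')
# val == 0x332211 (little-endian interpretation)
```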
+ def parse_value(self, value_str, bit_length, array=3DTrue):=0D
+ length =3D (bit_length + 7) // 8=0D
+ if check_quote(value_str):=0D
+ value_str =3D bytes_to_bracket_str(value_str[1:-1].encode())=0D
+ elif (',' in value_str) and (value_str[0] !=3D '{'):=0D
+ value_str =3D '{ %s }' % value_str=0D
+ if value_str[0] =3D=3D '{':=0D
+ result =3D expand_file_value(self._yaml_path, value_str)=0D
+ if len(result) =3D=3D 0:=0D
+ bin_list =3D value_str[1:-1].split(',')=0D
+ value =3D 0=0D
+ bit_len =3D 0=0D
+ unit_len =3D 1=0D
+ for idx, element in enumerate(bin_list):=0D
+ each =3D element.strip()=0D
+ if len(each) =3D=3D 0:=0D
+ continue=0D
+=0D
+ in_bit_field =3D False=0D
+ if each[0] in "'" + '"':=0D
+ each_value =3D bytearray(each[1:-1], 'utf-8')=0D
+ elif ':' in each:=0D
+ match =3D re.match("^(.+):(\\d+)([b|B|W|D|Q])$", e=
ach)=0D
+ if match is None:=0D
+ raise SystemExit("Exception: Invald value\=0D
+list format '%s' !" % each)=0D
+ if match.group(1) =3D=3D '0' and match.group(2) =
=3D=3D '0':=0D
+ unit_len =3D CGenYamlCfg.bits_width[match.grou=
p(3)=0D
+ ] // 8=0D
+ cur_bit_len =3D int(match.group(2)=0D
+ ) * CGenYamlCfg.bits_width[=0D
+ match.group(3)]=0D
+ value +=3D ((self.eval(match.group(1)) & (=0D
+ 1 << cur_bit_len) - 1)) << bit_len=0D
+ bit_len +=3D cur_bit_len=0D
+ each_value =3D bytearray()=0D
+ if idx + 1 < len(bin_list):=0D
+ in_bit_field =3D True=0D
+ else:=0D
+ try:=0D
+ each_value =3D value_to_bytearray(=0D
+ self.eval(each.strip()), unit_len)=0D
+ except Exception:=0D
+ raise SystemExit("Exception: Value %d cannot \=
=0D
+fit into %s bytes !" % (each, unit_len))=0D
+=0D
+ if not in_bit_field:=0D
+ if bit_len > 0:=0D
+ if bit_len % 8 !=3D 0:=0D
+ raise SystemExit("Exception: Invalid bit \=
=0D
+field alignment '%s' !" % value_str)=0D
+ result.extend(value_to_bytes(value, bit_len //=
8))=0D
+ value =3D 0=0D
+ bit_len =3D 0=0D
+=0D
+ result.extend(each_value)=0D
+=0D
+ elif check_quote(value_str):=0D
+ result =3D bytearray(value_str[1:-1], 'utf-8') # Excluding qu=
otes=0D
+ else:=0D
+ result =3D value_to_bytearray(self.eval(value_str), length)=0D
+=0D
+ if len(result) < length:=0D
+ result.extend(b'\x00' * (length - len(result)))=0D
+ elif len(result) > length:=0D
+ raise SystemExit("Exception: Value '%s' is too big to fit \=0D
+into %d bytes !" % (value_str, length))=0D
+=0D
+ if array:=0D
+ return result=0D
+ else:=0D
+ return bytes_to_value(result)=0D
+=0D
+ return result=0D
+=0D
+    def get_cfg_item_options(self, item):
+        tmp_list = []
+        if item['type'] == "Combo":
+            if item['option'] in CGenYamlCfg.builtin_option:
+                for op_val, op_str in CGenYamlCfg.builtin_option[item['option'
+                                                                 ]]:
+                    tmp_list.append((op_val, op_str))
+            else:
+                opt_list = item['option'].split(',')
+                for option in opt_list:
+                    option = option.strip()
+                    try:
+                        (op_val, op_str) = option.split(':')
+                    except Exception:
+                        raise SystemExit("Exception: Invalid \
+option format '%s' !" % option)
+                    tmp_list.append((op_val, op_str))
+        return tmp_list
+
+    def get_page_title(self, page_id, top=None):
+        if top is None:
+            top = self.get_cfg_page()['root']
+        for node in top['child']:
+            page_key = next(iter(node))
+            if page_id == page_key:
+                return node[page_key]['title']
+            else:
+                result = self.get_page_title(page_id, node[page_key])
+                if result is not None:
+                    return result
+        return None
+
+    def print_pages(self, top=None, level=0):
+        if top is None:
+            top = self.get_cfg_page()['root']
+        for node in top['child']:
+            page_id = next(iter(node))
+            print('%s%s: %s' % (' ' * level, page_id, node[page_id]['title']))
+            level += 1
+            self.print_pages(node[page_id], level)
+            level -= 1
+
+    def get_item_by_index(self, index):
+        return self._cfg_list[index]
+
+    def get_item_by_path(self, path):
+        node = self.locate_cfg_item(path)
+        if node:
+            return self.get_item_by_index(node['indx'])
+        else:
+            return None
+
+    def locate_cfg_path(self, item):
+        def _locate_cfg_path(root, level=0):
+            # config structure
+            if item is root:
+                return path
+            for key in root:
+                if type(root[key]) is OrderedDict:
+                    level += 1
+                    path.append(key)
+                    ret = _locate_cfg_path(root[key], level)
+                    if ret:
+                        return ret
+                    path.pop()
+            return None
+        path = []
+        return _locate_cfg_path(self._cfg_tree)
+
+    def locate_cfg_item(self, path, allow_exp=True):
+        def _locate_cfg_item(root, path, level=0):
+            if len(path) == level:
+                return root
+            next_root = root.get(path[level], None)
+            if next_root is None:
+                if allow_exp:
+                    raise Exception('Not a valid CFG config option path: %s' %
+                                    '.'.join(path[:level+1]))
+                else:
+                    return None
+            return _locate_cfg_item(next_root, path, level + 1)
+
+        path_nodes = path.split('.')
+        return _locate_cfg_item(self._cfg_tree, path_nodes)
+
+    def traverse_cfg_tree(self, handler, top=None):
+        def _traverse_cfg_tree(root, level=0):
+            # config structure
+            for key in root:
+                if type(root[key]) is OrderedDict:
+                    level += 1
+                    handler(key, root[key], level)
+                    _traverse_cfg_tree(root[key], level)
+                    level -= 1
+
+        if top is None:
+            top = self._cfg_tree
+        _traverse_cfg_tree(top)
+
+    def print_cfgs(self, root=None, short=True, print_level=256):
+        def _print_cfgs(name, cfgs, level):
+
+            if 'indx' in cfgs:
+                act_cfg = self.get_item_by_index(cfgs['indx'])
+            else:
+                offset = 0
+                length = 0
+                value = ''
+                if CGenYamlCfg.STRUCT in cfgs:
+                    cfg = cfgs[CGenYamlCfg.STRUCT]
+                    offset = int(cfg['offset'])
+                    length = int(cfg['length'])
+                    if 'value' in cfg:
+                        value = cfg['value']
+                if length == 0:
+                    return
+                act_cfg = dict({'value': value, 'offset': offset,
+                                'length': length})
+            value = act_cfg['value']
+            bit_len = act_cfg['length']
+            offset = (act_cfg['offset'] + 7) // 8
+            if value != '':
+                try:
+                    value = self.reformat_value_str(act_cfg['value'],
+                                                    act_cfg['length'])
+                except Exception:
+                    value = act_cfg['value']
+            length = bit_len // 8
+            bit_len = '(%db)' % bit_len if bit_len % 8 else '' * 4
+            if level <= print_level:
+                if short and len(value) > 40:
+                    value = '%s ... %s' % (value[:20], value[-20:])
+                print('%04X:%04X%-6s %s%s : %s' % (offset, length, bit_len,
+                                                   ' ' * level, name, value))
+
+        self.traverse_cfg_tree(_print_cfgs)
+
+    def build_var_dict(self):
+        def _build_var_dict(name, cfgs, level):
+            if level <= 2:
+                if CGenYamlCfg.STRUCT in cfgs:
+                    struct_info = cfgs[CGenYamlCfg.STRUCT]
+                    self._var_dict['_LENGTH_%s_' % name] = struct_info[
+                        'length'] // 8
+                    self._var_dict['_OFFSET_%s_' % name] = struct_info[
+                        'offset'] // 8
+
+        self._var_dict = {}
+        self.traverse_cfg_tree(_build_var_dict)
+        self._var_dict['_LENGTH_'] = self._cfg_tree[CGenYamlCfg.STRUCT][
+            'length'] // 8
+        return 0
+
+    def add_cfg_page(self, child, parent, title=''):
+        def _add_cfg_page(cfg_page, child, parent):
+            key = next(iter(cfg_page))
+            if parent == key:
+                cfg_page[key]['child'].append({child: {'title': title,
+                                                       'child': []}})
+                return True
+            else:
+                result = False
+                for each in cfg_page[key]['child']:
+                    if _add_cfg_page(each, child, parent):
+                        result = True
+                        break
+                return result
+
+        return _add_cfg_page(self._cfg_page, child, parent)
+
+    def set_cur_page(self, page_str):
+        if not page_str:
+            return
+
+        if ',' in page_str:
+            page_list = page_str.split(',')
+        else:
+            page_list = [page_str]
+        for page_str in page_list:
+            parts = page_str.split(':')
+            if len(parts) in [1, 3]:
+                page = parts[0].strip()
+                if len(parts) == 3:
+                    # it is a new page definition, add it into tree
+                    parent = parts[1] if parts[1] else 'root'
+                    parent = parent.strip()
+                    if parts[2][0] == '"' and parts[2][-1] == '"':
+                        parts[2] = parts[2][1:-1]
+
+                    if not self.add_cfg_page(page, parent, parts[2]):
+                        raise SystemExit("Error: Cannot find parent page \
+'%s'!" % parent)
+            else:
+                raise SystemExit("Error: Invalid page format '%s' !"
+                                 % page_str)
+            self._cur_page = page
+
+    def extend_variable(self, line):
+        # replace all variables
+        if line == '':
+            return line
+        loop = 2
+        while loop > 0:
+            line_after = DefTemplate(line).safe_substitute(self._def_dict)
+            if line == line_after:
+                break
+            loop -= 1
+            line = line_after
+        return line_after
+
+    def reformat_number_per_type(self, itype, value):
+        if check_quote(value) or value.startswith('{'):
+            return value
+        parts = itype.split(',')
+        if len(parts) > 3 and parts[0] == 'EditNum':
+            num_fmt = parts[1].strip()
+        else:
+            num_fmt = ''
+        if num_fmt == 'HEX' and not value.startswith('0x'):
+            value = '0x%X' % int(value, 10)
+        elif num_fmt == 'DEC' and value.startswith('0x'):
+            value = '%d' % int(value, 16)
+        return value
+
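reformat_number_per_type keys off an `EditNum, HEX, …` type string to normalize numeric display. A standalone sketch; note that the source only honors the format when the type string splits into more than three comma-separated parts, which holds when the range itself contains a comma (e.g. `(0x00, 0xFFFF)`):

```python
# Sketch of reformat_number_per_type: an 'EditNum, HEX, ...' type forces
# decimal input into hex form; 'DEC' does the reverse.
def reformat_number(itype, value):
    parts = itype.split(',')
    if len(parts) > 3 and parts[0] == 'EditNum':
        num_fmt = parts[1].strip()
    else:
        num_fmt = ''
    if num_fmt == 'HEX' and not value.startswith('0x'):
        value = '0x%X' % int(value, 10)
    elif num_fmt == 'DEC' and value.startswith('0x'):
        value = '%d' % int(value, 16)
    return value

hex_val = reformat_number('EditNum, HEX, (0x00, 0xFFFF)', '4660')
# hex_val == '0x1234'
```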
+    def add_cfg_item(self, name, item, offset, path):
+
+        self.set_cur_page(item.get('page', ''))
+
+        if name[0] == '$':
+            # skip all virtual node
+            return 0
+
+        if not set(item).issubset(CGenYamlCfg.keyword_set):
+            for each in list(item):
+                if each not in CGenYamlCfg.keyword_set:
+                    raise Exception("Invalid attribute '%s' for '%s'!" %
+                                    (each, '.'.join(path)))
+
+        length = item.get('length', 0)
+        if type(length) is str:
+            match = re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)\\s*$", length)
+            if match:
+                unit_len = CGenYamlCfg.bits_width[match.group(2)]
+                length = int(match.group(1), 10) * unit_len
+            else:
+                try:
+                    length = int(length, 0) * 8
+                except Exception:
+                    raise Exception("Invalid length field '%s' for '%s' !" %
+                                    (length, '.'.join(path)))
+
+                if offset % 8 > 0:
+                    raise Exception("Invalid alignment for field '%s' for \
+'%s' !" % (name, '.'.join(path)))
+        else:
+            # define is length in bytes
+            length = length * 8
+
+        if not name.isidentifier():
+            raise Exception("Invalid config name '%s' for '%s' !" %
+                            (name, '.'.join(path)))
+
+        itype = str(item.get('type', 'Reserved'))
+        value = str(item.get('value', ''))
+        if value:
+            if not (check_quote(value) or value.startswith('{')):
+                if ',' in value:
+                    value = '{ %s }' % value
+                else:
+                    value = self.reformat_number_per_type(itype, value)
+
+        help = str(item.get('help', ''))
+        if '\n' in help:
+            help = ' '.join([i.strip() for i in help.splitlines()])
+
+        option = str(item.get('option', ''))
+        if '\n' in option:
+            option = ' '.join([i.strip() for i in option.splitlines()])
+
+        # extend variables for value and condition
+        condition = str(item.get('condition', ''))
+        if condition:
+            condition = self.extend_variable(condition)
+        value = self.extend_variable(value)
+
+        order = str(item.get('order', ''))
+        if order:
+            if '.' in order:
+                (major, minor) = order.split('.')
+                order = int(major, 16)
+            else:
+                order = int(order, 16)
+        else:
+            order = offset
+
+        cfg_item = dict()
+        cfg_item['length'] = length
+        cfg_item['offset'] = offset
+        cfg_item['value'] = value
+        cfg_item['type'] = itype
+        cfg_item['cname'] = str(name)
+        cfg_item['name'] = str(item.get('name', ''))
+        cfg_item['help'] = help
+        cfg_item['option'] = option
+        cfg_item['page'] = self._cur_page
+        cfg_item['order'] = order
+        cfg_item['path'] = '.'.join(path)
+        cfg_item['condition'] = condition
+        if 'struct' in item:
+            cfg_item['struct'] = item['struct']
+        self._cfg_list.append(cfg_item)
+
+        item['indx'] = len(self._cfg_list) - 1
+
+        # remove used info for reducing pkl size
+        item.pop('option', None)
+        item.pop('condition', None)
+        item.pop('help', None)
+        item.pop('name', None)
+        item.pop('page', None)
+
+        return length
+
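The length field accepted by add_cfg_item may be a unit-suffixed string such as `2W` (2 units of 16 bits) or a plain integer byte count; internally everything becomes a bit length. A standalone sketch of that conversion, using the same `bits_width` table as the class:

```python
import re

# Same unit table as CGenYamlCfg.bits_width.
bits_width = {'b': 1, 'B': 8, 'W': 16, 'D': 32, 'Q': 64}

# Sketch of the length parsing in add_cfg_item: '2W' -> 2 * 16 bits;
# strings without a unit suffix, and plain ints, are byte counts.
def length_in_bits(length):
    if type(length) is str:
        match = re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)\\s*$", length)
        if match:
            return int(match.group(1), 10) * bits_width[match.group(2)]
        return int(length, 0) * 8
    return length * 8

bits = length_in_bits('2W')
# bits == 32; length_in_bits('0x04') and length_in_bits(4) both give 32 too
```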
+    def build_cfg_list(self, cfg_name='', top=None, path=[],
+                       info={'offset': 0}):
+        if top is None:
+            top = self._cfg_tree
+            info.clear()
+            info = {'offset': 0}
+
+        start = info['offset']
+        is_leaf = True
+        for key in top:
+            path.append(key)
+            if type(top[key]) is OrderedDict:
+                is_leaf = False
+                self.build_cfg_list(key, top[key], path, info)
+            path.pop()
+
+        if is_leaf:
+            length = self.add_cfg_item(cfg_name, top, info['offset'], path)
+            info['offset'] += length
+        elif cfg_name == '' or (cfg_name and cfg_name[0] != '$'):
+            # check first element for struct
+            first = next(iter(top))
+            struct_str = CGenYamlCfg.STRUCT
+            if first != struct_str:
+                struct_node = OrderedDict({})
+                top[struct_str] = struct_node
+                top.move_to_end(struct_str, False)
+            else:
+                struct_node = top[struct_str]
+            struct_node['offset'] = start
+            struct_node['length'] = info['offset'] - start
+            if struct_node['length'] % 8 != 0:
+                raise SystemExit("Error: Bits length not aligned for %s !" %
+                                 str(path))
+
+    def get_field_value(self, top=None):
+        def _get_field_value(name, cfgs, level):
+            if 'indx' in cfgs:
+                act_cfg = self.get_item_by_index(cfgs['indx'])
+                if act_cfg['length'] == 0:
+                    return
+                value = self.get_value(act_cfg['value'], act_cfg['length'],
+                                       False)
+                set_bits_to_bytes(result, act_cfg['offset'] -
+                                  struct_info['offset'], act_cfg['length'],
+                                  value)
+
+        if top is None:
+            top = self._cfg_tree
+        struct_info = top[CGenYamlCfg.STRUCT]
+        result = bytearray((struct_info['length'] + 7) // 8)
+        self.traverse_cfg_tree(_get_field_value, top)
+        return result
+
+    def set_field_value(self, top, value_bytes, force=False):
+        def _set_field_value(name, cfgs, level):
+            if 'indx' not in cfgs:
+                return
+            act_cfg = self.get_item_by_index(cfgs['indx'])
+            if force or act_cfg['value'] == '':
+                value = get_bits_from_bytes(full_bytes,
+                                            act_cfg['offset'] -
+                                            struct_info['offset'],
+                                            act_cfg['length'])
+                act_val = act_cfg['value']
+                if act_val == '':
+                    act_val = '%d' % value
+                act_val = self.reformat_number_per_type(act_cfg
+                                                        ['type'],
+                                                        act_val)
+                act_cfg['value'] = self.format_value_to_str(
+                    value, act_cfg['length'], act_val)
+
+        if 'indx' in top:
+            # it is config option
+            value = bytes_to_value(value_bytes)
+            act_cfg = self.get_item_by_index(top['indx'])
+            act_cfg['value'] = self.format_value_to_str(
+                value, act_cfg['length'], act_cfg['value'])
+        else:
+            # it is structure
+            struct_info = top[CGenYamlCfg.STRUCT]
+            length = struct_info['length'] // 8
+            full_bytes = bytearray(value_bytes[:length])
+            if len(full_bytes) < length:
+                full_bytes.extend(bytearray(length - len(value_bytes)))
+            self.traverse_cfg_tree(_set_field_value, top)
+
+    def update_def_value(self):
+        def _update_def_value(name, cfgs, level):
+            if 'indx' in cfgs:
+                act_cfg = self.get_item_by_index(cfgs['indx'])
+                if act_cfg['value'] != '' and act_cfg['length'] > 0:
+                    try:
+                        act_cfg['value'] = self.reformat_value_str(
+                            act_cfg['value'], act_cfg['length'])
+                    except Exception:
+                        raise Exception("Invalid value expression '%s' \
+for '%s' !" % (act_cfg['value'], act_cfg['path']))
+            else:
+                if CGenYamlCfg.STRUCT in cfgs and 'value' in \
+                        cfgs[CGenYamlCfg.STRUCT]:
+                    curr = cfgs[CGenYamlCfg.STRUCT]
+                    value_bytes = self.get_value(curr['value'],
+                                                 curr['length'], True)
+                    self.set_field_value(cfgs, value_bytes)
+
+        self.traverse_cfg_tree(_update_def_value, self._cfg_tree)
+
+    def evaluate_condition(self, item):
+        expr = item['condition']
+        result = self.parse_value(expr, 1, False)
+        return result
+
+    def detect_fsp(self):
+        cfg_segs = self.get_cfg_segment()
+        if len(cfg_segs) == 3:
+            fsp = True
+            for idx, seg in enumerate(cfg_segs):
+                if not seg[0].endswith('UPD_%s' % 'TMS'[idx]):
+                    fsp = False
+                    break
+        else:
+            fsp = False
+        if fsp:
+            self.set_mode('FSP')
+        return fsp
+
+    def get_cfg_segment(self):
+        def _get_cfg_segment(name, cfgs, level):
+            if 'indx' not in cfgs:
+                if name.startswith('$ACTION_'):
+                    if 'find' in cfgs:
+                        find[0] = cfgs['find']
+            else:
+                if find[0]:
+                    act_cfg = self.get_item_by_index(cfgs['indx'])
+                    segments.append([find[0], act_cfg['offset'] // 8, 0])
+                    find[0] = ''
+            return
+
+        find = ['']
+        segments = []
+        self.traverse_cfg_tree(_get_cfg_segment, self._cfg_tree)
+        cfg_len = self._cfg_tree[CGenYamlCfg.STRUCT]['length'] // 8
+        if len(segments) == 0:
+            segments.append(['', 0, cfg_len])
+
+        segments.append(['', cfg_len, 0])
+        cfg_segs = []
+        for idx, each in enumerate(segments[:-1]):
+            cfg_segs.append((each[0], each[1],
+                             segments[idx+1][1] - each[1]))
+
+        return cfg_segs
+
+    def get_bin_segment(self, bin_data):
+        cfg_segs = self.get_cfg_segment()
+        bin_segs = []
+        for seg in cfg_segs:
+            key = seg[0].encode()
+            if key == 0:
+                bin_segs.append([seg[0], 0, len(bin_data)])
+                break
+            pos = bin_data.find(key)
+            if pos >= 0:
+                # ensure no other match for the key
+                next_pos = bin_data.find(key, pos + len(seg[0]))
+                if next_pos >= 0:
+                    if key == b'$SKLFSP$' or key == b'$BSWFSP$':
+                        string = ('Warning: Multiple matches for %s in '
+                                  'binary!\n\nA workaround applied to such '
+                                  'FSP 1.x binary to use second'
+                                  ' match instead of first match!' % key)
+                        messagebox.showwarning('Warning!', string)
+                        pos = next_pos
+                    else:
+                        print("Warning: Multiple matches for '%s' "
+                              "in binary, the 1st instance will be used !"
+                              % seg[0])
+                bin_segs.append([seg[0], pos, seg[2]])
+            else:
+                raise Exception("Could not find '%s' in binary !"
+                                % seg[0])
+
+        return bin_segs
+
+    def extract_cfg_from_bin(self, bin_data):
+        # get cfg bin length
+        cfg_bins = bytearray()
+        bin_segs = self.get_bin_segment(bin_data)
+        for each in bin_segs:
+            cfg_bins.extend(bin_data[each[1]:each[1] + each[2]])
+        return cfg_bins
+
+    def save_current_to_bin(self):
+        cfg_bins = self.generate_binary_array()
+        if self._old_bin is None:
+            return cfg_bins
+
+        bin_data = bytearray(self._old_bin)
+        bin_segs = self.get_bin_segment(self._old_bin)
+        cfg_off = 0
+        for each in bin_segs:
+            length = each[2]
+            bin_data[each[1]:each[1] + length] = cfg_bins[cfg_off:
+                                                          cfg_off
+                                                          + length]
+            cfg_off += length
+        print('Patched the loaded binary successfully !')
+
+        return bin_data
+
+    def load_default_from_bin(self, bin_data):
+        self._old_bin = bin_data
+        cfg_bins = self.extract_cfg_from_bin(bin_data)
+        self.set_field_value(self._cfg_tree, cfg_bins, True)
+        return cfg_bins
+
+    def generate_binary_array(self, path=''):
+        if path == '':
+            top = None
+        else:
+            top = self.locate_cfg_item(path)
+            if not top:
+                raise Exception("Invalid configuration path '%s' !"
+                                % path)
+        return self.get_field_value(top)
+
+    def generate_binary(self, bin_file_name, path=''):
+        bin_file = open(bin_file_name, "wb")
+        bin_file.write(self.generate_binary_array(path))
+        bin_file.close()
+        return 0
+
+    def write_delta_file(self, out_file, platform_id, out_lines):
+        dlt_fd = open(out_file, "w")
+        dlt_fd.write("%s\n" % get_copyright_header('dlt', True))
+        if platform_id is not None:
+            dlt_fd.write('#\n')
+            dlt_fd.write('# Delta configuration values for '
+                         'platform ID 0x%04X\n'
+                         % platform_id)
+            dlt_fd.write('#\n\n')
+        for line in out_lines:
+            dlt_fd.write('%s\n' % line)
+        dlt_fd.close()
+
+    def override_default_value(self, dlt_file):
+        error = 0
+        dlt_lines = CGenYamlCfg.expand_include_files(dlt_file)
+
+        platform_id = None
+        for line, file_path, line_num in dlt_lines:
+            line = line.strip()
+            if not line or line.startswith('#'):
+                continue
+            match = re.match("\\s*([\\w\\.]+)\\s*\\|\\s*(.+)", line)
+            if not match:
+                raise Exception("Unrecognized line '%s' "
+                                "(File:'%s' Line:%d) !"
+                                % (line, file_path, line_num + 1))
+
+            path = match.group(1)
+            value_str = match.group(2)
+            top = self.locate_cfg_item(path)
+            if not top:
+                raise Exception(
+                    "Invalid configuration '%s' (File:'%s' Line:%d) !" %
+                    (path, file_path, line_num + 1))
+
+            if 'indx' in top:
+                act_cfg = self.get_item_by_index(top['indx'])
+                bit_len = act_cfg['length']
+            else:
+                struct_info = top[CGenYamlCfg.STRUCT]
+                bit_len = struct_info['length']
+
+            value_bytes = self.parse_value(value_str, bit_len)
+            self.set_field_value(top, value_bytes, True)
+
+            if path == 'PLATFORMID_CFG_DATA.PlatformId':
+                platform_id = value_str
+
+        if platform_id is None:
+            raise Exception(
+                "PLATFORMID_CFG_DATA.PlatformId is missing "
+                "in file '%s' !" %
+                (dlt_file))
+
+        return error
+
+    def generate_delta_file_from_bin(self, delta_file, old_data,
+                                     new_data, full=False):
+        new_data = self.load_default_from_bin(new_data)
+        lines = []
+        platform_id = None
+        def_platform_id = 0
+
+        for item in self._cfg_list:
+            if not full and (item['type'] in ['Reserved']):
+                continue
+            old_val = get_bits_from_bytes(old_data, item['offset'],
+                                          item['length'])
+            new_val = get_bits_from_bytes(new_data, item['offset'],
+                                          item['length'])
+
+            full_name = item['path']
+            if 'PLATFORMID_CFG_DATA.PlatformId' == full_name:
+                def_platform_id = old_val
+            if new_val != old_val or full:
+                val_str = self.reformat_value_str(item['value'],
+                                                  item['length'])
+                text = '%-40s | %s' % (full_name, val_str)
+                lines.append(text)
+
+        if self.get_mode() != 'FSP':
+            if platform_id is None or def_platform_id == platform_id:
+                platform_id = def_platform_id
+                print("WARNING: 'PlatformId' configuration is "
+                      "same as default %d!" % platform_id)
+
+            lines.insert(0, '%-40s | %s\n\n' %
+                         ('PLATFORMID_CFG_DATA.PlatformId',
+                          '0x%04X' % platform_id))
+        else:
+            platform_id = None
+
+        self.write_delta_file(delta_file, platform_id, lines)
+
+        return 0
+
+    def generate_delta_file(self, delta_file, bin_file, bin_file2,
+                            full=False):
+        fd = open(bin_file, 'rb')
+        new_data = self.extract_cfg_from_bin(bytearray(fd.read()))
+        fd.close()
+
+        if bin_file2 == '':
+            old_data = self.generate_binary_array()
+        else:
+            old_data = new_data
+            fd = open(bin_file2, 'rb')
+            new_data = self.extract_cfg_from_bin(bytearray(fd.read()))
+            fd.close()
+
+        return self.generate_delta_file_from_bin(delta_file,
+                                                 old_data, new_data, full)
+
+    def prepare_marshal(self, is_save):
+        if is_save:
+            # Ordered dict is not marshallable, convert to list
+            self._cfg_tree = CGenYamlCfg.deep_convert_dict(self._cfg_tree)
+        else:
+            # Revert it back
+            self._cfg_tree = CGenYamlCfg.deep_convert_list(self._cfg_tree)
+
+    def generate_yml_file(self, in_file, out_file):
+        cfg_yaml = CFG_YAML()
+        text = cfg_yaml.expand_yaml(in_file)
+        yml_fd = open(out_file, "w")
+        yml_fd.write(text)
+        yml_fd.close()
+        return 0
+
+    def write_cfg_header_file(self, hdr_file_name, tag_mode,
+                              tag_dict, struct_list):
+        lines = []
+        lines.append('\n\n')
+        if self.get_mode() == 'FSP':
+            lines.append('#include <FspUpd.h>\n')
+
+        tag_mode = tag_mode & 0x7F
+        tag_list = sorted(list(tag_dict.items()), key=lambda x: x[1])
+        for tagname, tagval in tag_list:
+            if (tag_mode == 0 and tagval >= 0x100) or \
+                    (tag_mode == 1 and tagval < 0x100):
+                continue
+            lines.append('#define %-30s 0x%03X\n' % (
+                'CDATA_%s_TAG' % tagname[:-9], tagval))
+        lines.append('\n\n')
+
+        name_dict = {}
+        new_dict = {}
+        for each in struct_list:
+            if (tag_mode == 0 and each['tag'] >= 0x100) or \
+                    (tag_mode == 1 and each['tag'] < 0x100):
+                continue
+            new_dict[each['name']] = (each['alias'], each['count'])
+            if each['alias'] not in name_dict:
+                name_dict[each['alias']] = 1
+                lines.extend(self.create_struct(each['alias'],
+                                                each['node'], new_dict))
+        lines.append('#pragma pack()\n\n')
+
+        self.write_header_file(lines, hdr_file_name)
+
+    def write_header_file(self, txt_body, file_name, type='h'):
+        file_name_def = os.path.basename(file_name).replace('.', '_')
+        file_name_def = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', file_name_def)
+        file_name_def = re.sub('([a-z0-9])([A-Z])', r'\1_\2',
+                               file_name_def).upper()
+
+        lines = []
+        lines.append("%s\n" % get_copyright_header(type))
+        lines.append("#ifndef __%s__\n" % file_name_def)
+        lines.append("#define __%s__\n\n" % file_name_def)
+        if type == 'h':
+            lines.append("#pragma pack(1)\n\n")
+        lines.extend(txt_body)
+        if type == 'h':
+            lines.append("#pragma pack()\n\n")
+        lines.append("#endif\n")
+
+        # Don't rewrite if the contents are the same
+        create = True
+        if os.path.exists(file_name):
+            hdr_file = open(file_name, "r")
+            org_txt = hdr_file.read()
+            hdr_file.close()
+
+            new_txt = ''.join(lines)
+            if org_txt == new_txt:
+                create = False
+
+        if create:
+            hdr_file = open(file_name, "w")
+            hdr_file.write(''.join(lines))
+            hdr_file.close()
+
+    def generate_data_inc_file(self, dat_inc_file_name, bin_file=None):
+        # Put a prefix GUID before CFGDATA so that it can be located later on
+        prefix = b'\xa7\xbd\x7f\x73\x20\x1e\x46\xd6\
+\xbe\x8f\x64\x12\x05\x8d\x0a\xa8'
+        if bin_file:
+            fin = open(bin_file, 'rb')
+            bin_dat = prefix + bytearray(fin.read())
+            fin.close()
+        else:
+            bin_dat = prefix + self.generate_binary_array()
+
+        file_name = os.path.basename(dat_inc_file_name).upper()
+        file_name = file_name.replace('.', '_')
+
+        txt_lines = []
+
+        txt_lines.append("UINT8  mConfigDataBlob[%d] = {\n" % len(bin_dat))
+        count = 0
+        line = ['  ']
+        for each in bin_dat:
+            line.append('0x%02X, ' % each)
+            count = count + 1
+            if (count & 0x0F) == 0:
+                line.append('\n')
+                txt_lines.append(''.join(line))
+                line = ['  ']
+        if len(line) > 1:
+            txt_lines.append(''.join(line) + '\n')
+
+        txt_lines.append("};\n\n")
+        self.write_header_file(txt_lines, dat_inc_file_name, 'inc')
+
+        return 0
+
+    def get_struct_array_info(self, input):
+        parts = input.split(':')
+        if len(parts) > 1:
+            var = parts[1]
+            input = parts[0]
+        else:
+            var = ''
+        array_str = input.split('[')
+        name = array_str[0]
+        if len(array_str) > 1:
+            num_str = ''.join(c for c in array_str[-1] if c.isdigit())
+            num_str = '1000' if len(num_str) == 0 else num_str
+            array_num = int(num_str)
+        else:
+            array_num = 0
+        return name, array_num, var
+
+    def process_multilines(self, string, max_char_length):
+        multilines = ''
+        string_length = len(string)
+        current_string_start = 0
+        string_offset = 0
+        break_line_dict = []
+        if len(string) <= max_char_length:
+            while (string_offset < string_length):
+                if string_offset >= 1:
+                    if string[string_offset - 1] == '\\' and string[
+                            string_offset] == 'n':
+                        break_line_dict.append(string_offset + 1)
+                string_offset += 1
+            if break_line_dict != []:
+                for each in break_line_dict:
+                    multilines += "  %s\n" % string[
+                        current_string_start:each].lstrip()
+                    current_string_start = each
+                if string_length - current_string_start > 0:
+                    multilines += "  %s\n" % string[
+                        current_string_start:].lstrip()
+            else:
+                multilines = "  %s\n" % string
+        else:
+            new_line_start = 0
+            new_line_count = 0
+            found_space_char = False
+            while (string_offset < string_length):
+                if string_offset >= 1:
+                    if new_line_count >= max_char_length - 1:
+                        if string[string_offset] == ' ' and \
+                                string_length - string_offset > 10:
+                            break_line_dict.append(new_line_start
+                                                   + new_line_count)
+                            new_line_start = new_line_start + new_line_count
+                            new_line_count = 0
+                            found_space_char = True
+                        elif string_offset == string_length - 1 and \
+                                found_space_char is False:
+                            break_line_dict.append(0)
+                    if string[string_offset - 1] == '\\' and string[
+                            string_offset] == 'n':
+                        break_line_dict.append(string_offset + 1)
+                        new_line_start = string_offset + 1
+                        new_line_count = 0
+                string_offset += 1
+                new_line_count += 1
+            if break_line_dict != []:
+                break_line_dict.sort()
+                for each in break_line_dict:
+                    if each > 0:
+                        multilines += "  %s\n" % string[
+                            current_string_start:each].lstrip()
+                        current_string_start = each
+                if string_length - current_string_start > 0:
+                    multilines += "  %s\n" % \
+                        string[current_string_start:].lstrip()
+        return multilines
+
+    def create_field(self, item, name, length, offset, struct,
+                     bsf_name, help, option, bits_length=None):
+        pos_name = 28
+        name_line = ''
+        # help_line = ''
+        # option_line = ''
+
+        if length == 0 and name == 'dummy':
+            return '\n'
+
+        if bits_length == 0:
+            return '\n'
+
+        is_array = False
+        if length in [1, 2, 4, 8]:
+            type = "UINT%d" % (length * 8)
+        else:
+            is_array = True
+            type = "UINT8"
+
+        if item and item['value'].startswith('{'):
+            type = "UINT8"
+            is_array = True
+
+        if struct != '':
+            struct_base = struct.rstrip('*')
+            name = '*' * (len(struct) - len(struct_base)) + name
+            struct = struct_base
+            type = struct
+            if struct in ['UINT8', 'UINT16', 'UINT32', 'UINT64']:
+                is_array = True
+                unit = int(type[4:]) // 8
+                length = length / unit
+            else:
+                is_array = False
+
+        if is_array:
+            name = name + '[%d]' % length
+
+        if len(type) < pos_name:
+            space1 = pos_name - len(type)
+        else:
+            space1 = 1
+
+        if bsf_name != '':
+            name_line = " %s\n" % bsf_name
+        else:
+            name_line = "N/A\n"
+
+        # if help != '':
+        #     help_line = self.process_multilines(help, 80)
+
+        # if option != '':
+        #     option_line = self.process_multilines(option, 80)
+
+        if offset is None:
+            offset_str = '????'
+        else:
+            offset_str = '0x%04X' % offset
+
+        if bits_length is None:
+            bits_length = ''
+        else:
+            bits_length = ' : %d' % bits_length
+
+        # return "\n/** %s%s%s**/\n  %s%s%s%s;\n" % (name_line, help_line,
+        #     option_line, type, ' ' * space1, name, bits_length)
+        return "\n    /* Offset %s: %s */\n    %s%s%s%s;\n" % (
+            offset_str, name_line.strip(), type, ' ' * space1,
+            name, bits_length)
+
+    def create_struct(self, cname, top, struct_dict):
+        index = 0
+        last = ''
+        lines = []
+        off_base = -1
+
+        if cname in struct_dict:
+            if struct_dict[cname][2]:
+                return []
+        lines.append('\ntypedef struct {\n')
+        for field in top:
+            if field[0] == '$':
+                continue
+
+            index += 1
+
+            t_item = top[field]
+            if 'indx' not in t_item:
+                if CGenYamlCfg.STRUCT not in top[field]:
+                    continue
+
+                if struct_dict[field][1] == 0:
+                    continue
+
+                append = True
+                struct_info = top[field][CGenYamlCfg.STRUCT]
+
+                if 'struct' in struct_info:
+                    struct, array_num, var = self.get_struct_array_info(
+                        struct_info['struct'])
+                    if array_num > 0:
+                        if last == struct:
+                            append = False
+                        last = struct
+                        if var == '':
+                            var = field
+
+                        field = CGenYamlCfg.format_struct_field_name(
+                            var, struct_dict[field][1])
+                else:
+                    struct = struct_dict[field][0]
+                    field = CGenYamlCfg.format_struct_field_name(
+                        field, struct_dict[field][1])
+
+                if append:
+                    offset = t_item['$STRUCT']['offset'] // 8
+                    if off_base == -1:
+                        off_base = offset
+                    line = self.create_field(None, field, 0, 0, struct,
+                                             '', '', '')
+                    lines.append('  %s' % line)
+                    last = struct
+                continue
+
+            item = self.get_item_by_index(t_item['indx'])
+            if item['cname'] == 'CfgHeader' and index == 1 or \
+                    (item['cname'] == 'CondValue' and index == 2):
+                continue
+
+            bit_length = None
+            length = (item['length'] + 7) // 8
+            match = re.match("^(\\d+)([b|B|W|D|Q])([B|W|D|Q]?)",
+                             t_item['length'])
+            if match and match.group(2) == 'b':
+                bit_length = int(match.group(1))
+                if match.group(3) != '':
+                    length = CGenYamlCfg.bits_width[match.group(3)] // 8
+                else:
+                    length = 4
+            offset = item['offset'] // 8
+            if off_base == -1:
+                off_base = offset
+            struct = item.get('struct', '')
+            name = field
+            prompt = item['name']
+            help = item['help']
+            option = item['option']
+            line = self.create_field(item, name, length, offset, struct,
+                                     prompt, help, option, bit_length)
+            lines.append('  %s' % line)
+            last = struct
+
+        lines.append('\n} %s;\n\n' % cname)
+
+        return lines
+
+    def write_fsp_sig_header_file(self, hdr_file_name):
+        hdr_fd = open(hdr_file_name, 'w')
+        hdr_fd.write("%s\n" % get_copyright_header('h'))
+        hdr_fd.write("#ifndef __FSPUPD_H__\n"
+                     "#define __FSPUPD_H__\n\n"
+                     "#include <FspEas.h>\n\n"
+                     "#pragma pack(1)\n\n")
+        lines = []
+        for fsp_comp in 'TMS':
+            top = self.locate_cfg_item('FSP%s_UPD' % fsp_comp)
+            if not top:
+                raise Exception('Could not find FSP UPD definition !')
+            bins = self.get_field_value(top)
+            lines.append("#define FSP%s_UPD_SIGNATURE"
+                         " 0x%016X  /* '%s' */\n\n"
+                         % (fsp_comp, bytes_to_value(bins[:8]),
+                            bins[:8].decode()))
+        hdr_fd.write(''.join(lines))
+        hdr_fd.write("#pragma pack()\n\n"
+                     "#endif\n")
+        hdr_fd.close()
+
+    def create_header_file(self, hdr_file_name, com_hdr_file_name='',
+                           path=''):
+
+        def _build_header_struct(name, cfgs, level):
+            if CGenYamlCfg.STRUCT in cfgs:
+                if 'CfgHeader' in cfgs:
+                    # collect CFGDATA TAG IDs
+                    cfghdr = self.get_item_by_index(
+                        cfgs['CfgHeader']['indx'])
+                    tag_val = array_str_to_value(cfghdr['value']) >> 20
+                    tag_dict[name] = tag_val
+                    if level == 1:
+                        tag_curr[0] = tag_val
+                struct_dict[name] = (level, tag_curr[0], cfgs)
+        if path == 'FSP_SIG':
+            self.write_fsp_sig_header_file(hdr_file_name)
+            return
+        tag_curr = [0]
+        tag_dict = {}
+        struct_dict = {}
+
+        if path == '':
+            top = None
+        else:
+            top = self.locate_cfg_item(path)
+            if not top:
+                raise Exception("Invalid configuration path '%s' !" % path)
+            _build_header_struct(path, top, 0)
+        self.traverse_cfg_tree(_build_header_struct, top)
+
+        if tag_curr[0] == 0:
+            hdr_mode = 2
+        else:
+            hdr_mode = 1
+
+        if re.match('FSP[TMS]_UPD', path):
+            hdr_mode |= 0x80
+
+        # filter out the items to be built for tags and structures
+        struct_list = []
+        for each in struct_dict:
+            match = False
+            for check in CGenYamlCfg.exclude_struct:
+                if re.match(check, each):
+                    match = True
+                    if each in tag_dict:
+                        if each not in CGenYamlCfg.include_tag:
+                            del tag_dict[each]
+                    break
+            if not match:
+                struct_list.append({'name': each, 'alias': '', 'count': 0,
+                                    'level': struct_dict[each][0],
+                                    'tag': struct_dict[each][1],
+                                    'node': struct_dict[each][2]})
+
+        # sort by level so that the bottom level struct
+        # will be built first to satisfy dependencies
+        struct_list = sorted(struct_list, key=lambda x: x['level'],
+                             reverse=True)
+
+        # Convert XXX_[0-9]+ to XXX as an array hint
+        for each in struct_list:
+            cfgs = each['node']
+            if 'struct' in cfgs['$STRUCT']:
+                each['alias'], array_num, var = self.get_struct_array_info(
+                    cfgs['$STRUCT']['struct'])
+            else:
+                match = re.match('(\\w+)(_\\d+)', each['name'])
+                if match:
+                    each['alias'] = match.group(1)
+                else:
+                    each['alias'] = each['name']
+
+        # count items for array build
+        for idx, each in enumerate(struct_list):
+            if idx > 0:
+                last_struct = struct_list[idx-1]['node']['$STRUCT']
+                curr_struct = each['node']['$STRUCT']
+                if struct_list[idx-1]['alias'] == each['alias'] and \
+                        curr_struct['length'] == last_struct['length'] and \
+                        curr_struct['offset'] == last_struct['offset'] + \
+                        last_struct['length']:
+                    for idx2 in range(idx-1, -1, -1):
+                        if struct_list[idx2]['count'] > 0:
+                            struct_list[idx2]['count'] += 1
+                            break
+                    continue
+            each['count'] = 1
+
+        # generate common header
+        if com_hdr_file_name:
+            self.write_cfg_header_file(com_hdr_file_name, 0, tag_dict,
+                                       struct_list)
+
+        # generate platform header
+        self.write_cfg_header_file(hdr_file_name, hdr_mode, tag_dict,
+                                   struct_list)
+
+        return 0
+
+    def load_yaml(self, cfg_file):
+        cfg_yaml = CFG_YAML()
+        self.initialize()
+        self._cfg_tree = cfg_yaml.load_yaml(cfg_file)
+        self._def_dict = cfg_yaml.def_dict
+        self._yaml_path = os.path.dirname(cfg_file)
+        self.build_cfg_list()
+        self.build_var_dict()
+        self.update_def_value()
+        return 0
+
+
+def usage():
+    print('\n'.join([
+        "GenYamlCfg Version 0.50",
+        "Usage:",
+        "    GenYamlCfg  GENINC  BinFile   IncOutFile "
+        "   [-D Macros]",
+
+        "    GenYamlCfg  GENPKL  YamlFile  PklOutFile "
+        "   [-D Macros]",
+        "    GenYamlCfg  GENBIN  YamlFile[;DltFile]  BinOutFile "
+        "   [-D Macros]",
+        "    GenYamlCfg  GENDLT  YamlFile[;BinFile]  DltOutFile "
+        "   [-D Macros]",
+        "    GenYamlCfg  GENYML  YamlFile  YamlOutFile"
+        "   [-D Macros]",
+        "    GenYamlCfg  GENHDR  YamlFile  HdrOutFile "
+        "   [-D Macros]"
+        ]))
+
+
+def main():
+    # Parse the options and args
+    argc = len(sys.argv)
+    if argc < 4:
+        usage()
+        return 1
+
+    gen_cfg_data = CGenYamlCfg()
+    command = sys.argv[1].upper()
+    out_file = sys.argv[3]
+    if argc >= 5 and gen_cfg_data.parse_macros(sys.argv[4:]) != 0:
+        raise Exception("ERROR: Macro parsing failed !")
+
+    file_list = sys.argv[2].split(';')
+    if len(file_list) >= 2:
+        yml_file = file_list[0]
+        dlt_file = file_list[1]
+    elif len(file_list) == 1:
+        yml_file = file_list[0]
+        dlt_file = ''
+    else:
+        raise Exception("ERROR: Invalid parameter '%s' !" % sys.argv[2])
+    yml_scope = ''
+    if '@' in yml_file:
+        parts = yml_file.split('@')
+        yml_file = parts[0]
+        yml_scope = parts[1]
+
+    if command == "GENDLT" and yml_file.endswith('.dlt'):
+        # It needs to expand an existing DLT file
+        dlt_file = yml_file
+        lines = gen_cfg_data.expand_include_files(dlt_file)
+        write_lines(lines, out_file)
+        return 0
+
+    if command == "GENYML":
+        if not yml_file.lower().endswith('.yaml'):
+            raise Exception('Only YAML file is supported !')
+        gen_cfg_data.generate_yml_file(yml_file, out_file)
+        return 0
+
+    bin_file = ''
+    if (yml_file.lower().endswith('.bin')) and (command == "GENINC"):
+        # It is binary file
+        bin_file = yml_file
+        yml_file = ''
+
+    if bin_file:
+        gen_cfg_data.generate_data_inc_file(out_file, bin_file)
+        return 0
+
+    cfg_bin_file = ''
+    cfg_bin_file2 = ''
+    if dlt_file:
+        if command == "GENDLT":
+            cfg_bin_file = dlt_file
+            dlt_file = ''
+            if len(file_list) >= 3:
+                cfg_bin_file2 = file_list[2]
+
+    if yml_file.lower().endswith('.pkl'):
+        with open(yml_file, "rb") as pkl_file:
+            gen_cfg_data.__dict__ = marshal.load(pkl_file)
+        gen_cfg_data.prepare_marshal(False)
+
+        # Override macro definition again for Pickle file
+        if argc >= 5:
+            gen_cfg_data.parse_macros(sys.argv[4:])
+    else:
+        gen_cfg_data.load_yaml(yml_file)
+        if command == 'GENPKL':
+            gen_cfg_data.prepare_marshal(True)
+            with open(out_file, "wb") as pkl_file:
+                marshal.dump(gen_cfg_data.__dict__, pkl_file)
+            json_file = os.path.splitext(out_file)[0] + '.json'
+            fo = open(json_file, 'w')
+            path_list = []
+            cfgs = {'_cfg_page': gen_cfg_data._cfg_page,
+                    '_cfg_list': gen_cfg_data._cfg_list,
+                    '_path_list': path_list}
+            # optimize to reduce size
+            path = None
+            for each in cfgs['_cfg_list']:
+                new_path = each['path'][:-len(each['cname'])-1]
+                if path != new_path:
+                    path = new_path
+                    each['path'] = path
+                    path_list.append(path)
+                else:
+                    del each['path']
+                if each['order'] == each['offset']:
+                    del each['order']
+                del each['offset']
+
+                # value is just used to indicate display type
+                value = each['value']
+                if value.startswith('0x'):
+                    hex_len = ((each['length'] + 7) // 8) * 2
+                    if len(value) == hex_len:
+                        value = 'x%d' % hex_len
+                    else:
+                        value = 'x'
+                    each['value'] = value
+                elif value and value[0] in ['"', "'", '{']:
+                    each['value'] = value[0]
+                else:
+                    del each['value']
+
+            fo.write(repr(cfgs))
+            fo.close()
+            return 0
+
+    if dlt_file:
+        gen_cfg_data.override_default_value(dlt_file)
+
+    gen_cfg_data.detect_fsp()
+
+    if command == "GENBIN":
+        if len(file_list) == 3:
+            old_data = gen_cfg_data.generate_binary_array()
+            fi = open(file_list[2], 'rb')
+            new_data = bytearray(fi.read())
+            fi.close()
+            if len(new_data) != len(old_data):
+                raise Exception("Binary file '%s' length does not match, \
+ignored !" % file_list[2])
+            else:
+                gen_cfg_data.load_default_from_bin(new_data)
+                gen_cfg_data.override_default_value(dlt_file)
+
+        gen_cfg_data.generate_binary(out_file, yml_scope)
+
+    elif command == "GENDLT":
+        full = True if 'FULL' in gen_cfg_data._macro_dict else False
+        gen_cfg_data.generate_delta_file(out_file, cfg_bin_file,
+                                         cfg_bin_file2, full)
+
+    elif command == "GENHDR":
+        out_files = out_file.split(';')
+        brd_out_file = out_files[0].strip()
+        if len(out_files) > 1:
+            com_out_file = out_files[1].strip()
+        else:
+            com_out_file = ''
+        gen_cfg_data.create_header_file(brd_out_file, com_out_file, yml_scope)
+
+    elif command == "GENINC":
+        gen_cfg_data.generate_data_inc_file(out_file)
+
+    elif command == "DEBUG":
+        gen_cfg_data.print_cfgs()
+
+    else:
+        raise Exception("Unsupported command '%s' !" % command)
+
+    return 0
+
+
+if __name__ == '__main__':
+    sys.exit(main())
diff --git a/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py b/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py
new file mode 100644
index 0000000000..7e008aa68a
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/ConfigEditor/SingleSign.py
@@ -0,0 +1,324 @@
@@ -0,0 +1,324 @@
+#!/usr/bin/env python=0D
+# @ SingleSign.py=0D
+# Single signing script=0D
+#=0D
+# Copyright (c) 2020 - 2021, Intel Corporation. All rights reserved.<BR>=0D
+# SPDX-License-Identifier: BSD-2-Clause-Patent=0D
+#=0D
+##=0D
+=0D
+import os=0D
+import sys=0D
+import re=0D
+import shutil=0D
+import subprocess=0D
+=0D
+SIGNING_KEY =3D {=0D
+ # Key Id | Key File Name start |=0D
+ # =3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=
=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=
=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=3D=0D
+ # KEY_ID_MASTER is used for signing Slimboot Key Hash Manifest \=0D
+ # container (KEYH Component)=0D
+ "KEY_ID_MASTER_RSA2048": "MasterTestKey_Priv_RSA2048.pem",=0D
+ "KEY_ID_MASTER_RSA3072": "MasterTestKey_Priv_RSA3072.pem",=0D
+=0D
+ # KEY_ID_CFGDATA is used for signing external Config data blob)=0D
+ "KEY_ID_CFGDATA_RSA2048": "ConfigTestKey_Priv_RSA2048.pem",=0D
+ "KEY_ID_CFGDATA_RSA3072": "ConfigTestKey_Priv_RSA3072.pem",=0D
+=0D
+ # KEY_ID_FIRMWAREUPDATE is used for signing capsule firmware update im=
age)=0D
+ "KEY_ID_FIRMWAREUPDATE_RSA2048": "FirmwareUpdateTestKey_Priv_RSA2048.p=
em",=0D
+ "KEY_ID_FIRMWAREUPDATE_RSA3072": "FirmwareUpdateTestKey_Priv_RSA3072.p=
em",=0D
+=0D
+ # KEY_ID_CONTAINER is used for signing container header with mono sign=
ature=0D
+ "KEY_ID_CONTAINER_RSA2048": "ContainerTestKey_Priv_RSA2048.pem",=0D
+ "KEY_ID_CONTAINER_RSA3072": "ContainerTestKey_Priv_RSA3072.pem",=0D
+=0D
+ # CONTAINER_COMP1_KEY_ID is used for signing container components=0D
+ "KEY_ID_CONTAINER_COMP_RSA2048": "ContainerCompTestKey_Priv_RSA2048.pe=
m",=0D
+ "KEY_ID_CONTAINER_COMP_RSA3072": "ContainerCompTestKey_Priv_RSA3072.pe=
m",=0D
+=0D
+ # KEY_ID_OS1_PUBLIC, KEY_ID_OS2_PUBLIC is used for referencing \=0D
+ # Boot OS public keys=0D
+ "KEY_ID_OS1_PUBLIC_RSA2048": "OS1_TestKey_Pub_RSA2048.pem",=0D
+ "KEY_ID_OS1_PUBLIC_RSA3072": "OS1_TestKey_Pub_RSA3072.pem",=0D
+=0D
+ "KEY_ID_OS2_PUBLIC_RSA2048": "OS2_TestKey_Pub_RSA2048.pem",=0D
+ "KEY_ID_OS2_PUBLIC_RSA3072": "OS2_TestKey_Pub_RSA3072.pem",=0D
+=0D
+ }=0D
+=0D
+MESSAGE_SBL_KEY_DIR =3D """!!! PRE-REQUISITE: Path to SBL_KEY_DIR has.=0D
+to be set with SBL KEYS DIRECTORY !!! \n!!! Generate keys.=0D
+using GenerateKeys.py available in BootloaderCorePkg/Tools.=0D
+directory !!! \n !!! Run $python.=0D
+BootloaderCorePkg/Tools/GenerateKeys.py -k $PATH_TO_SBL_KEY_DIR !!!\n=0D
+!!! Set SBL_KEY_DIR environ with path to SBL KEYS DIR !!!\n"=0D
+!!! Windows $set SBL_KEY_DIR=3D$PATH_TO_SBL_KEY_DIR !!!\n=0D
+!!! Linux $export SBL_KEY_DIR=3D$PATH_TO_SBL_KEY_DIR !!!\n"""=0D
+=0D
+=0D
+def get_openssl_path():
+    if os.name == 'nt':
+        if 'OPENSSL_PATH' not in os.environ:
+            openssl_dir = "C:\\Openssl\\bin\\"
+            if os.path.exists(openssl_dir):
+                os.environ['OPENSSL_PATH'] = openssl_dir
+            else:
+                os.environ['OPENSSL_PATH'] = "C:\\Openssl\\"
+        if 'OPENSSL_CONF' not in os.environ:
+            openssl_cfg = "C:\\Openssl\\openssl.cfg"
+            if os.path.exists(openssl_cfg):
+                os.environ['OPENSSL_CONF'] = openssl_cfg
+        openssl = os.path.join(
+            os.environ.get('OPENSSL_PATH', ''),
+            'openssl.exe')
+    else:
+        # Get openssl path for Linux cases
+        openssl = shutil.which('openssl')
+
+    return openssl
+
+
+def run_process(arg_list, print_cmd=False, capture_out=False):
+    sys.stdout.flush()
+    if print_cmd:
+        print(' '.join(arg_list))
+
+    exc = None
+    result = 0
+    output = ''
+    try:
+        if capture_out:
+            output = subprocess.check_output(arg_list).decode()
+        else:
+            result = subprocess.call(arg_list)
+    except Exception as ex:
+        result = 1
+        exc = ex
+
+    if result:
+        if not print_cmd:
+            print('Error in running process:\n  %s' % ' '.join(arg_list))
+        if exc is None:
+            sys.exit(1)
+        else:
+            raise exc
+
+    return output
+
+
+def check_file_pem_format(priv_key):
+    # Check for file .pem format
+    key_name = os.path.basename(priv_key)
+    if os.path.splitext(key_name)[1] == ".pem":
+        return True
+    else:
+        return False
+
+
+def get_key_id(priv_key):=0D
+ # Extract base name if path is provided.=0D
+ key_name =3D os.path.basename(priv_key)=0D
+ # Check for KEY_ID in key naming.=0D
+ if key_name.startswith('KEY_ID'):=0D
+ return key_name=0D
+ else:=0D
+ return None=0D
+=0D
+=0D
+def get_sbl_key_dir():=0D
+ # Check Key store setting SBL_KEY_DIR path=0D
+ if 'SBL_KEY_DIR' not in os.environ:=0D
+ exception_string =3D "ERROR: SBL_KEY_DIR is not defined." \=0D
+ " Set SBL_KEY_DIR with SBL Keys directory!!\n"=0D
+ raise Exception(exception_string + MESSAGE_SBL_KEY_DIR)=0D
+=0D
+ sbl_key_dir =3D os.environ.get('SBL_KEY_DIR')=0D
+ if not os.path.exists(sbl_key_dir):=0D
+ exception_string =3D "ERROR:SBL_KEY_DIR set " + sbl_key_dir \=0D
+ + " is not valid." \=0D
+ " Set the correct SBL_KEY_DIR path !!\n" \=0D
+ + MESSAGE_SBL_KEY_DIR=0D
+ raise Exception(exception_string)=0D
+ else:=0D
+ return sbl_key_dir=0D
+=0D
+=0D
+def get_key_from_store(in_key):=0D
+=0D
+ # Check in_key is path to key=0D
+ if os.path.exists(in_key):=0D
+ return in_key=0D
+=0D
+ # Get Slimboot key dir path=0D
+ sbl_key_dir =3D get_sbl_key_dir()=0D
+=0D
+ # Extract if in_key is key_id=0D
+ priv_key =3D get_key_id(in_key)=0D
+ if priv_key is not None:=0D
+ if (priv_key in SIGNING_KEY):=0D
+ # Generate key file name from key id=0D
+ priv_key_file =3D SIGNING_KEY[priv_key]=0D
+        else:
+            exception_string = "KEY_ID " + priv_key + " is not found" \
+                " in supported KEY IDs!!"
+            raise Exception(exception_string)
+ elif check_file_pem_format(in_key):=0D
+ # check if file name is provided in pem format=0D
+ priv_key_file =3D in_key=0D
+ else:=0D
+ priv_key_file =3D None=0D
+ raise Exception('key provided %s is not valid!' % in_key)=0D
+=0D
+ # Create a file path=0D
+ # Join Key Dir and priv_key_file=0D
+ try:=0D
+ priv_key =3D os.path.join(sbl_key_dir, priv_key_file)=0D
+ except Exception:=0D
+ raise Exception('priv_key is not found %s!' % priv_key)=0D
+=0D
+    # Check that the priv_key constructed from the KEY ID exists at the
+    # specified path
+    if not os.path.isfile(priv_key):
+        exception_string = "!!! ERROR: Key file corresponding to " \
+            + in_key + " does not exist in the Sbl key " \
+            "directory at " + sbl_key_dir + " !!!\n" \
+            + MESSAGE_SBL_KEY_DIR
+        raise Exception(exception_string)
+=0D
+ return priv_key=0D
+=0D
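+# The resolution order above (explicit path first, then KEY_ID alias, then a
+# bare .pem file name under the key directory) can be sketched as a pure
+# function. Here key_map stands in for this patch's SIGNING_KEY table and is
+# an assumption for illustration only.

```python
import os


def resolve_key(in_key, key_dir, key_map):
    # Hypothetical simplified resolver mirroring the lookup order:
    # 1) explicit path, 2) KEY_ID alias, 3) bare .pem name in key_dir.
    if os.path.exists(in_key):
        return in_key
    name = os.path.basename(in_key)
    if name.startswith('KEY_ID'):
        if name not in key_map:
            raise KeyError(name)
        name = key_map[name]
    elif not name.endswith('.pem'):
        raise ValueError(in_key)
    return os.path.join(key_dir, name)
```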
+#=0D
+# Sign a file using openssl
+#=0D
+# priv_key [Input] Key Id or Path to Private key=0D
+# hash_type [Input] Signing hash=0D
+# sign_scheme[Input] Sign/padding scheme=0D
+# in_file [Input] Input file to be signed=0D
+# out_file [Input/Output] Signed data file=0D
+#=0D
+=0D
+=0D
+def single_sign_file(priv_key, hash_type, sign_scheme, in_file, out_file):
+=0D
+ _hash_type_string =3D {=0D
+ "SHA2_256": 'sha256',=0D
+ "SHA2_384": 'sha384',=0D
+ "SHA2_512": 'sha512',=0D
+ }=0D
+=0D
+ _hash_digest_Size =3D {=0D
+ # Hash_string : Hash_Size=0D
+ "SHA2_256": 32,=0D
+ "SHA2_384": 48,=0D
+ "SHA2_512": 64,=0D
+ "SM3_256": 32,=0D
+ }=0D
+=0D
+ _sign_scheme_string =3D {=0D
+ "RSA_PKCS1": 'pkcs1',=0D
+ "RSA_PSS": 'pss',=0D
+ }=0D
+=0D
+ priv_key =3D get_key_from_store(priv_key)=0D
+=0D
+ # Temporary files to store hash generated=0D
+ hash_file_tmp =3D out_file+'.hash.tmp'=0D
+ hash_file =3D out_file+'.hash'=0D
+=0D
+ # Generate hash using openssl dgst in hex format=0D
+ cmdargs =3D [get_openssl_path(),=0D
+ 'dgst',=0D
+ '-'+'%s' % _hash_type_string[hash_type],=0D
+ '-out', '%s' % hash_file_tmp, '%s' % in_file]=0D
+ run_process(cmdargs)=0D
+=0D
+    # Extract the hash from the dgst command output and convert it to binary
+    with open(hash_file_tmp, 'r') as fin:
+        hashdata = fin.read()
+=0D
+ try:=0D
+ hashdata =3D hashdata.rsplit('=3D', 1)[1].strip()=0D
+ except Exception:=0D
+ raise Exception('Hash Data not found for signing!')=0D
+=0D
+    if len(hashdata) != (_hash_digest_Size[hash_type] * 2):
+        raise Exception('Hash data size does not match the hash type!')
+=0D
+ hashdata_bytes =3D bytearray.fromhex(hashdata)=0D
+ open(hash_file, 'wb').write(hashdata_bytes)=0D
+=0D
+    print("Key used for signing: %s !!" % priv_key)
+=0D
+ # sign using Openssl pkeyutl=0D
+ cmdargs =3D [get_openssl_path(),=0D
+ 'pkeyutl', '-sign', '-in', '%s' % hash_file,=0D
+ '-inkey', '%s' % priv_key, '-out',=0D
+ '%s' % out_file, '-pkeyopt',=0D
+ 'digest:%s' % _hash_type_string[hash_type],=0D
+ '-pkeyopt', 'rsa_padding_mode:%s' %=0D
+ _sign_scheme_string[sign_scheme]]=0D
+=0D
+ run_process(cmdargs)=0D
+=0D
+ return=0D
+=0D
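+# The hash-extraction step above (split the "openssl dgst" output on '=',
+# validate the hex length against the digest size, convert to bytes) can be
+# isolated as a testable helper. This is a sketch of the same logic, not the
+# patch's code.

```python
def extract_hash_hex(dgst_output, digest_size):
    # openssl dgst prints e.g. "SHA2-256(file)= ab12..."; keep the hex part.
    try:
        hex_part = dgst_output.rsplit('=', 1)[1].strip()
    except IndexError:
        raise Exception('Hash data not found for signing!')
    if len(hex_part) != digest_size * 2:
        raise Exception('Hash data size does not match the hash type!')
    return bytearray.fromhex(hex_part)
```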
+#=0D
+# Extract public key using openssl=0D
+#=0D
+# in_key [Input] Private key or public key in pem format=0D
+# pub_key_file [Input/Output] Public Key to a file=0D
+#=0D
+# return keydata (mod, exp) in bin format=0D
+#=0D
+=0D
+=0D
+def single_sign_gen_pub_key(in_key, pub_key_file=3DNone):=0D
+=0D
+ in_key =3D get_key_from_store(in_key)=0D
+=0D
+ # Expect key to be in PEM format=0D
+ is_prv_key =3D False=0D
+    cmdline = [get_openssl_path(), 'rsa', '-pubout', '-text', '-noout',
+               '-in', '%s' % in_key]
+ # Check if it is public key or private key=0D
+ text =3D open(in_key, 'r').read()=0D
+ if '-BEGIN RSA PRIVATE KEY-' in text:=0D
+ is_prv_key =3D True=0D
+ elif '-BEGIN PUBLIC KEY-' in text:=0D
+ cmdline.extend(['-pubin'])=0D
+ else:=0D
+ raise Exception('Unknown key format "%s" !' % in_key)=0D
+=0D
+ if pub_key_file:=0D
+ cmdline.extend(['-out', '%s' % pub_key_file])=0D
+ capture =3D False=0D
+ else:=0D
+ capture =3D True=0D
+=0D
+ output =3D run_process(cmdline, capture_out=3Dcapture)=0D
+    if not capture:
+        output = open(pub_key_file, 'r').read()
+ data =3D output.replace('\r', '')=0D
+ data =3D data.replace('\n', '')=0D
+ data =3D data.replace(' ', '')=0D
+=0D
+ # Extract the modulus=0D
+ if is_prv_key:=0D
+        match = re.search('modulus(.*)publicExponent:\\s+(\\d+)\\s+', data)
+    else:
+        match = re.search('Modulus(?:.*?):(.*)Exponent:\\s+(\\d+)\\s+', data)
+ if not match:=0D
+ raise Exception('Public key not found!')=0D
+ modulus =3D match.group(1).replace(':', '')=0D
+ exponent =3D int(match.group(2))=0D
+=0D
+ mod =3D bytearray.fromhex(modulus)=0D
+ # Remove the '00' from the front if the MSB is 1=0D
+ if mod[0] =3D=3D 0 and (mod[1] & 0x80):=0D
+ mod =3D mod[1:]=0D
+ exp =3D bytearray.fromhex('{:08x}'.format(exponent))=0D
+=0D
+ keydata =3D mod + exp=0D
+=0D
+ return keydata=0D
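+# The packing at the end of single_sign_gen_pub_key (drop the leading 00
+# byte that DER text output inserts when the modulus MSB is set, then append
+# the exponent as 4 big-endian bytes) can be sketched on its own:

```python
def pack_rsa_pubkey(modulus_hex, exponent):
    # Strip the leading 0x00 added when the modulus MSB is set, then
    # append the public exponent as 4 big-endian bytes.
    mod = bytearray.fromhex(modulus_hex)
    if mod[0] == 0 and (mod[1] & 0x80):
        mod = mod[1:]
    exp = bytearray.fromhex('{:08x}'.format(exponent))
    return mod + exp
```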
diff --git a/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py b/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py
index d2ca7145ae..c64b50404d 100644
--- a/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py
+++ b/IntelFsp2Pkg/Tools/FspDscBsf2Yaml.py
@@ -1,8 +1,7 @@
#!/usr/bin/env python=0D
-## @ FspDscBsf2Yaml.py=0D
-# This script convert DSC or BSF format file into YAML format=0D
-#=0D
-# Copyright(c) 2021, Intel Corporation. All rights reserved.<BR>=0D
+# @ FspBsf2Dsc.py=0D
+# This script converts FSP BSF format into DSC format
+# Copyright (c) 2020 - 2021, Intel Corporation. All rights reserved.<BR>=0D
# SPDX-License-Identifier: BSD-2-Clause-Patent=0D
#=0D
##=0D
@@ -10,277 +9,38 @@
import os=0D
import re=0D
import sys=0D
-from datetime import date=0D
+=0D
from collections import OrderedDict=0D
-from functools import reduce=0D
+from datetime import date=0D
=0D
-from GenCfgOpt import CGenCfgOpt=0D
+from FspGenCfgData import CFspBsf2Dsc, CGenCfgData=0D
=0D
__copyright_tmp__ =3D """## @file=0D
#=0D
-# YAML CFGDATA %s File.=0D
-#=0D
-# Copyright(c) %4d, Intel Corporation. All rights reserved.<BR>=0D
-# SPDX-License-Identifier: BSD-2-Clause-Patent=0D
-#=0D
-##=0D
-"""=0D
-=0D
-__copyright_dsc__ =3D """## @file=0D
+# Slim Bootloader CFGDATA %s File.=0D
#=0D
-# Copyright (c) %04d, Intel Corporation. All rights reserved.<BR>=0D
+# Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>=0D
# SPDX-License-Identifier: BSD-2-Clause-Patent=0D
#=0D
##=0D
-=0D
-[PcdsDynamicVpd.Upd]=0D
- #=0D
- # Global definitions in BSF=0D
- # !BSF BLOCK:{NAME:"FSP UPD Configuration", VER:"0.1"}=0D
- #=0D
-=0D
"""=0D
=0D
=0D
-def Bytes2Val(Bytes):=0D
- return reduce(lambda x, y: (x << 8) | y, Bytes[::-1])=0D
-=0D
-=0D
-def Str2Bytes(Value, Blen):=0D
- Result =3D bytearray(Value[1:-1], 'utf-8') # Excluding quotes=0D
- if len(Result) < Blen:=0D
- Result.extend(b'\x00' * (Blen - len(Result)))=0D
- return Result=0D
-=0D
-=0D
-class CFspBsf2Dsc:=0D
-=0D
- def __init__(self, bsf_file):=0D
- self.cfg_list =3D CFspBsf2Dsc.parse_bsf(bsf_file)=0D
-=0D
- def get_dsc_lines(self):=0D
- return CFspBsf2Dsc.generate_dsc(self.cfg_list)=0D
-=0D
- def save_dsc(self, dsc_file):=0D
- return CFspBsf2Dsc.generate_dsc(self.cfg_list, dsc_file)=0D
-=0D
- @staticmethod=0D
- def parse_bsf(bsf_file):=0D
-=0D
- fd =3D open(bsf_file, 'r')=0D
- bsf_txt =3D fd.read()=0D
- fd.close()=0D
-=0D
- find_list =3D []=0D
- regex =3D re.compile(r'\s+Find\s+"(.*?)"(.*?)^\s+\$(.*?)\s+', re.S=
| re.MULTILINE)=0D
- for match in regex.finditer(bsf_txt):=0D
- find =3D match.group(1)=0D
- name =3D match.group(3)=0D
- if not name.endswith('_Revision'):=0D
- raise Exception("Unexpected CFG item following 'Find' !")=
=0D
- find_list.append((name, find))=0D
-=0D
- idx =3D 0=0D
- count =3D 0=0D
- prefix =3D ''=0D
- chk_dict =3D {}=0D
- cfg_list =3D []=0D
- cfg_temp =3D {'find': '', 'cname': '', 'length': 0, 'value': '0', =
'type': 'Reserved',=0D
- 'embed': '', 'page': '', 'option': '', 'instance': 0}=
=0D
- regex =3D re.compile(r'^\s+(\$(.*?)|Skip)\s+(\d+)\s+bytes(\s+\$_DE=
FAULT_\s+=3D\s+(.+?))?$',=0D
- re.S | re.MULTILINE)=0D
-=0D
- for match in regex.finditer(bsf_txt):=0D
- dlen =3D int(match.group(3))=0D
- if match.group(1) =3D=3D 'Skip':=0D
- key =3D 'gPlatformFspPkgTokenSpaceGuid_BsfSkip%d' % idx=0D
- val =3D ', '.join(['%02X' % ord(i) for i in '\x00' * dlen]=
)=0D
- idx +=3D 1=0D
- option =3D '$SKIP'=0D
- else:=0D
- key =3D match.group(2)=0D
- val =3D match.group(5)=0D
- option =3D ''=0D
-=0D
- cfg_item =3D dict(cfg_temp)=0D
- finds =3D [i for i in find_list if i[0] =3D=3D key]=0D
- if len(finds) > 0:=0D
- if count >=3D 1:=0D
- # Append a dummy one=0D
- cfg_item['cname'] =3D 'Dummy'=0D
- cfg_list.append(dict(cfg_item))=0D
- cfg_list[-1]['embed'] =3D '%s:TAG_%03X:END' % (prefix,=
ord(prefix[-1]))=0D
- prefix =3D finds[0][1]=0D
- cfg_item['embed'] =3D '%s:TAG_%03X:START' % (prefix, ord(p=
refix[-1]))=0D
- cfg_item['find'] =3D prefix=0D
- cfg_item['cname'] =3D 'Signature'=0D
- cfg_item['length'] =3D len(finds[0][1])=0D
- str2byte =3D Str2Bytes("'" + finds[0][1] + "'", len(finds[=
0][1]))=0D
- cfg_item['value'] =3D '0x%X' % Bytes2Val(str2byte)=0D
- cfg_list.append(dict(cfg_item))=0D
- cfg_item =3D dict(cfg_temp)=0D
- find_list.pop(0)=0D
- count =3D 0=0D
-=0D
- cfg_item['cname'] =3D key=0D
- cfg_item['length'] =3D dlen=0D
- cfg_item['value'] =3D val=0D
- cfg_item['option'] =3D option=0D
-=0D
- if key not in chk_dict.keys():=0D
- chk_dict[key] =3D 0=0D
- else:=0D
- chk_dict[key] +=3D 1=0D
- cfg_item['instance'] =3D chk_dict[key]=0D
-=0D
- cfg_list.append(cfg_item)=0D
- count +=3D 1=0D
-=0D
- if prefix:=0D
- cfg_item =3D dict(cfg_temp)=0D
- cfg_item['cname'] =3D 'Dummy'=0D
- cfg_item['embed'] =3D '%s:%03X:END' % (prefix, ord(prefix[-1])=
)=0D
- cfg_list.append(cfg_item)=0D
-=0D
- option_dict =3D {}=0D
- selreg =3D re.compile(r'\s+Selection\s*(.+?)\s*,\s*"(.*?)"$', re.S=
| re.MULTILINE)=0D
- regex =3D re.compile(r'^List\s&(.+?)$(.+?)^EndList$', re.S | re.MU=
LTILINE)=0D
- for match in regex.finditer(bsf_txt):=0D
- key =3D match.group(1)=0D
- option_dict[key] =3D []=0D
- for select in selreg.finditer(match.group(2)):=0D
- option_dict[key].append((int(select.group(1), 0), select.g=
roup(2)))=0D
-=0D
- chk_dict =3D {}=0D
- pagereg =3D re.compile(r'^Page\s"(.*?)"$(.+?)^EndPage$', re.S | re=
.MULTILINE)=0D
- for match in pagereg.finditer(bsf_txt):=0D
- page =3D match.group(1)=0D
- for line in match.group(2).splitlines():=0D
- match =3D re.match(r'\s+(Combo|EditNum)\s\$(.+?),\s"(.*?)"=
,\s(.+?),$', line)=0D
- if match:=0D
- cname =3D match.group(2)=0D
- if cname not in chk_dict.keys():=0D
- chk_dict[cname] =3D 0=0D
- else:=0D
- chk_dict[cname] +=3D 1=0D
- instance =3D chk_dict[cname]=0D
- cfg_idxs =3D [i for i, j in enumerate(cfg_list) if j['=
cname'] =3D=3D cname and j['instance'] =3D=3D instance]=0D
- if len(cfg_idxs) !=3D 1:=0D
- raise Exception("Multiple CFG item '%s' found !" %=
cname)=0D
- cfg_item =3D cfg_list[cfg_idxs[0]]=0D
- cfg_item['page'] =3D page=0D
- cfg_item['type'] =3D match.group(1)=0D
- cfg_item['prompt'] =3D match.group(3)=0D
- cfg_item['range'] =3D None=0D
- if cfg_item['type'] =3D=3D 'Combo':=0D
- cfg_item['option'] =3D option_dict[match.group(4)[=
1:]]=0D
- elif cfg_item['type'] =3D=3D 'EditNum':=0D
- cfg_item['option'] =3D match.group(4)=0D
- match =3D re.match(r'\s+ Help\s"(.*?)"$', line)=0D
- if match:=0D
- cfg_item['help'] =3D match.group(1)=0D
-=0D
- match =3D re.match(r'\s+"Valid\srange:\s(.*)"$', line)=0D
- if match:=0D
- parts =3D match.group(1).split()=0D
- cfg_item['option'] =3D (=0D
- (int(parts[0], 0), int(parts[2], 0), cfg_item['opt=
ion']))=0D
-=0D
- return cfg_list=0D
-=0D
- @staticmethod=0D
- def generate_dsc(option_list, dsc_file=3DNone):=0D
- dsc_lines =3D []=0D
- header =3D '%s' % (__copyright_dsc__ % date.today().year)=0D
- dsc_lines.extend(header.splitlines())=0D
-=0D
- pages =3D []=0D
- for cfg_item in option_list:=0D
- if cfg_item['page'] and (cfg_item['page'] not in pages):=0D
- pages.append(cfg_item['page'])=0D
-=0D
- page_id =3D 0=0D
- for page in pages:=0D
- dsc_lines.append(' # !BSF PAGES:{PG%02X::"%s"}' % (page_id, p=
age))=0D
- page_id +=3D 1=0D
- dsc_lines.append('')=0D
-=0D
- last_page =3D ''=0D
- for option in option_list:=0D
- dsc_lines.append('')=0D
- default =3D option['value']=0D
- pos =3D option['cname'].find('_')=0D
- name =3D option['cname'][pos + 1:]=0D
-=0D
- if option['find']:=0D
- dsc_lines.append(' # !BSF FIND:{%s}' % option['find'])=0D
- dsc_lines.append('')=0D
-=0D
- if option['instance'] > 0:=0D
- name =3D name + '_%s' % option['instance']=0D
-=0D
- if option['embed']:=0D
- dsc_lines.append(' # !HDR EMBED:{%s}' % option['embed'])=
=0D
-=0D
- if option['type'] =3D=3D 'Reserved':=0D
- dsc_lines.append(' # !BSF NAME:{Reserved} TYPE:{Reserved}=
')=0D
- if option['option'] =3D=3D '$SKIP':=0D
- dsc_lines.append(' # !BSF OPTION:{$SKIP}')=0D
- else:=0D
- prompt =3D option['prompt']=0D
-=0D
- if last_page !=3D option['page']:=0D
- last_page =3D option['page']=0D
- dsc_lines.append(' # !BSF PAGE:{PG%02X}' % (pages.ind=
ex(option['page'])))=0D
-=0D
- if option['type'] =3D=3D 'Combo':=0D
- dsc_lines.append(' # !BSF NAME:{%s} TYPE:{%s}' %=0D
- (prompt, option['type']))=0D
- ops =3D []=0D
- for val, text in option['option']:=0D
- ops.append('0x%x:%s' % (val, text))=0D
- dsc_lines.append(' # !BSF OPTION:{%s}' % (', '.join(o=
ps)))=0D
- elif option['type'] =3D=3D 'EditNum':=0D
- cfg_len =3D option['length']=0D
- if ',' in default and cfg_len > 8:=0D
- dsc_lines.append(' # !BSF NAME:{%s} TYPE:{Table}'=
% (prompt))=0D
- if cfg_len > 16:=0D
- cfg_len =3D 16=0D
- ops =3D []=0D
- for i in range(cfg_len):=0D
- ops.append('%X:1:HEX' % i)=0D
- dsc_lines.append(' # !BSF OPTION:{%s}' % (', '.jo=
in(ops)))=0D
- else:=0D
- dsc_lines.append(=0D
- ' # !BSF NAME:{%s} TYPE:{%s, %s,(0x%X, 0x%X)}=
' %=0D
- (prompt, option['type'], option['option'][2],=
=0D
- option['option'][0], option['option'][1]))=0D
- dsc_lines.append(' # !BSF HELP:{%s}' % option['help'])=0D
-=0D
- if ',' in default:=0D
- default =3D '{%s}' % default=0D
- dsc_lines.append(' gCfgData.%-30s | * | 0x%04X | %s' %=0D
- (name, option['length'], default))=0D
-=0D
- if dsc_file:=0D
- fd =3D open(dsc_file, 'w')=0D
- fd.write('\n'.join(dsc_lines))=0D
- fd.close()=0D
-=0D
- return dsc_lines=0D
-=0D
-=0D
class CFspDsc2Yaml():=0D
=0D
def __init__(self):=0D
self._Hdr_key_list =3D ['EMBED', 'STRUCT']=0D
- self._Bsf_key_list =3D ['NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES', '=
OPTION',=0D
- 'CONDITION', 'ORDER', 'MARKER', 'SUBT', 'FIE=
LD', 'FIND']=0D
+ self._Bsf_key_list =3D ['NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES',=0D
+ 'OPTION', 'CONDITION', 'ORDER', 'MARKER',=0D
+ 'SUBT', 'FIELD', 'FIND']=0D
self.gen_cfg_data =3D None=0D
- self.cfg_reg_exp =3D re.compile(r"^([_a-zA-Z0-9$\(\)]+)\s*\|\s*(0x=
[0-9A-F]+|\*)\s*\|"=0D
- + r"\s*(\d+|0x[0-9a-fA-F]+)\s*\|\s*(=
.+)")=0D
- self.bsf_reg_exp =3D re.compile(r"(%s):{(.+?)}(?:$|\s+)" % '|'.joi=
n(self._Bsf_key_list))=0D
- self.hdr_reg_exp =3D re.compile(r"(%s):{(.+?)}" % '|'.join(self._H=
dr_key_list))=0D
+ self.cfg_reg_exp =3D re.compile(=0D
+ "^([_a-zA-Z0-9$\\(\\)]+)\\s*\\|\\s*(0x[0-9A-F]+|\\*)"=0D
+ "\\s*\\|\\s*(\\d+|0x[0-9a-fA-F]+)\\s*\\|\\s*(.+)")=0D
+ self.bsf_reg_exp =3D re.compile("(%s):{(.+?)}(?:$|\\s+)"=0D
+ % '|'.join(self._Bsf_key_list))=0D
+ self.hdr_reg_exp =3D re.compile("(%s):{(.+?)}"=0D
+ % '|'.join(self._Hdr_key_list))=0D
self.prefix =3D ''=0D
self.unused_idx =3D 0=0D
self.offset =3D 0=0D
@@ -290,15 +50,15 @@ class CFspDsc2Yaml():
"""=0D
Load and parse a DSC CFGDATA file.=0D
"""=0D
- gen_cfg_data =3D CGenCfgOpt('FSP')=0D
+ gen_cfg_data =3D CGenCfgData('FSP')=0D
if file_name.endswith('.dsc'):=0D
- # if gen_cfg_data.ParseDscFileYaml(file_name, '') !=3D 0:=0D
- if gen_cfg_data.ParseDscFile(file_name, '') !=3D 0:=0D
+ if gen_cfg_data.ParseDscFile(file_name) !=3D 0:=0D
raise Exception('DSC file parsing error !')=0D
if gen_cfg_data.CreateVarDict() !=3D 0:=0D
raise Exception('DSC variable creation error !')=0D
else:=0D
raise Exception('Unsupported file "%s" !' % file_name)=0D
+ gen_cfg_data.UpdateDefaultValue()=0D
self.gen_cfg_data =3D gen_cfg_data=0D
=0D
def print_dsc_line(self):=0D
@@ -312,14 +72,15 @@ class CFspDsc2Yaml():
"""=0D
Format a CFGDATA item into YAML format.=0D
"""=0D
- if(not text.startswith('!expand')) and (': ' in text):=0D
+ if (not text.startswith('!expand')) and (': ' in text):=0D
tgt =3D ':' if field =3D=3D 'option' else '- '=0D
text =3D text.replace(': ', tgt)=0D
lines =3D text.splitlines()=0D
if len(lines) =3D=3D 1 and field !=3D 'help':=0D
return text=0D
else:=0D
- return '>\n ' + '\n '.join([indent + i.lstrip() for i in l=
ines])=0D
+ return '>\n ' + '\n '.join(=0D
+ [indent + i.lstrip() for i in lines])=0D
=0D
def reformat_pages(self, val):=0D
# Convert XXX:YYY into XXX::YYY format for page definition=0D
@@ -355,14 +116,16 @@ class CFspDsc2Yaml():
cfg['page'] =3D self.reformat_pages(cfg['page'])=0D
=0D
if 'struct' in cfg:=0D
- cfg['value'] =3D self.reformat_struct_value(cfg['struct'], cfg=
['value'])=0D
+ cfg['value'] =3D self.reformat_struct_value(=0D
+ cfg['struct'], cfg['value'])=0D
=0D
def parse_dsc_line(self, dsc_line, config_dict, init_dict, include):=0D
"""=0D
Parse a line in DSC and update the config dictionary accordingly.=
=0D
"""=0D
init_dict.clear()=0D
- match =3D re.match(r'g(CfgData|\w+FspPkgTokenSpaceGuid)\.(.+)', ds=
c_line)=0D
+ match =3D re.match('g(CfgData|\\w+FspPkgTokenSpaceGuid)\\.(.+)',=0D
+ dsc_line)=0D
if match:=0D
match =3D self.cfg_reg_exp.match(match.group(2))=0D
if not match:=0D
@@ -385,7 +148,7 @@ class CFspDsc2Yaml():
self.offset =3D offset + int(length, 0)=0D
return True=0D
=0D
- match =3D re.match(r"^\s*#\s+!([<>])\s+include\s+(.+)", dsc_line)=
=0D
+ match =3D re.match("^\\s*#\\s+!([<>])\\s+include\\s+(.+)", dsc_lin=
e)=0D
if match and len(config_dict) =3D=3D 0:=0D
# !include should not be inside a config field=0D
# if so, do not convert include into YAML=0D
@@ -398,7 +161,7 @@ class CFspDsc2Yaml():
config_dict['include'] =3D ''=0D
return True=0D
=0D
- match =3D re.match(r"^\s*#\s+(!BSF|!HDR)\s+(.+)", dsc_line)=0D
+ match =3D re.match("^\\s*#\\s+(!BSF|!HDR)\\s+(.+)", dsc_line)=0D
if not match:=0D
return False=0D
=0D
@@ -434,16 +197,19 @@ class CFspDsc2Yaml():
tmp_name =3D parts[0][:-5]=0D
if tmp_name =3D=3D 'CFGHDR':=0D
cfg_tag =3D '_$FFF_'=0D
- sval =3D '!expand { %s_TMPL : [ ' % tmp_name + '%s=
, %s, ' % (parts[1], cfg_tag) \=0D
- + ', '.join(parts[2:]) + ' ] }'=0D
+                        sval = '!expand { %s_TMPL : [ ' % \
+                            tmp_name + '%s, %s, ' % (parts[1], cfg_tag) + \
+                            ', '.join(parts[2:]) + ' ] }'
else:=0D
- sval =3D '!expand { %s_TMPL : [ ' % tmp_name + ', =
'.join(parts[1:]) + ' ] }'=0D
+ sval =3D '!expand { %s_TMPL : [ ' % \=0D
+ tmp_name + ', '.join(parts[1:]) + ' ] }'=0D
config_dict.clear()=0D
config_dict['cname'] =3D tmp_name=0D
config_dict['expand'] =3D sval=0D
return True=0D
else:=0D
- if key in ['name', 'help', 'option'] and val.startswit=
h('+'):=0D
+ if key in ['name', 'help', 'option'] and \=0D
+ val.startswith('+'):=0D
val =3D config_dict[key] + '\n' + val[1:]=0D
if val.strip() =3D=3D '':=0D
val =3D "''"=0D
@@ -493,21 +259,23 @@ class CFspDsc2Yaml():
include_file =3D ['.']=0D
=0D
for line in lines:=0D
- match =3D re.match(r"^\s*#\s+!([<>])\s+include\s+(.+)", line)=
=0D
+ match =3D re.match("^\\s*#\\s+!([<>])\\s+include\\s+(.+)", lin=
e)=0D
if match:=0D
if match.group(1) =3D=3D '<':=0D
include_file.append(match.group(2))=0D
else:=0D
include_file.pop()=0D
=0D
- match =3D re.match(r"^\s*#\s+(!BSF)\s+DEFT:{(.+?):(START|END)}=
", line)=0D
+ match =3D re.match(=0D
+ "^\\s*#\\s+(!BSF)\\s+DEFT:{(.+?):(START|END)}", line)=0D
if match:=0D
if match.group(3) =3D=3D 'START' and not template_name:=0D
template_name =3D match.group(2).strip()=0D
temp_file_dict[template_name] =3D list(include_file)=0D
bsf_temp_dict[template_name] =3D []=0D
- if match.group(3) =3D=3D 'END' and (template_name =3D=3D m=
atch.group(2).strip()) \=0D
- and template_name:=0D
+ if match.group(3) =3D=3D 'END' and \=0D
+ (template_name =3D=3D match.group(2).strip()) and =
\=0D
+ template_name:=0D
template_name =3D ''=0D
else:=0D
if template_name:=0D
@@ -531,12 +299,14 @@ class CFspDsc2Yaml():
init_dict.clear()=0D
padding_dict =3D {}=0D
cfgs.append(padding_dict)=0D
- padding_dict['cname'] =3D 'UnusedUpdSpace%d' % self.un=
used_idx=0D
+ padding_dict['cname'] =3D 'UnusedUpdSpace%d' % \=0D
+ self.unused_idx=0D
padding_dict['length'] =3D '0x%x' % num=0D
padding_dict['value'] =3D '{ 0 }'=0D
self.unused_idx +=3D 1=0D
=0D
- if cfgs and cfgs[-1]['cname'][0] !=3D '@' and config_dict[=
'cname'][0] =3D=3D '@':=0D
+ if cfgs and cfgs[-1]['cname'][0] !=3D '@' and \=0D
+ config_dict['cname'][0] =3D=3D '@':=0D
# it is a bit field, mark the previous one as virtual=
=0D
cname =3D cfgs[-1]['cname']=0D
new_cfg =3D dict(cfgs[-1])=0D
@@ -545,7 +315,8 @@ class CFspDsc2Yaml():
cfgs[-1]['cname'] =3D cname=0D
cfgs.append(new_cfg)=0D
=0D
- if cfgs and cfgs[-1]['cname'] =3D=3D 'CFGHDR' and config_d=
ict['cname'][0] =3D=3D '<':=0D
+ if cfgs and cfgs[-1]['cname'] =3D=3D 'CFGHDR' and \=0D
+ config_dict['cname'][0] =3D=3D '<':=0D
# swap CfgHeader and the CFG_DATA order=0D
if ':' in config_dict['cname']:=0D
# replace the real TAG for CFG_DATA=0D
@@ -661,7 +432,7 @@ class CFspDsc2Yaml():
lines =3D []=0D
for each in self.gen_cfg_data._MacroDict:=0D
key, value =3D self.variable_fixup(each)=0D
- lines.append('%-30s : %s' % (key, value))=0D
+ lines.append('%-30s : %s' % (key, value))=0D
return lines=0D
=0D
def output_template(self):=0D
@@ -671,7 +442,8 @@ class CFspDsc2Yaml():
self.offset =3D 0=0D
self.base_offset =3D 0=0D
start, end =3D self.get_section_range('PcdsDynamicVpd.Tmp')=0D
- bsf_temp_dict, temp_file_dict =3D self.process_template_lines(self=
.gen_cfg_data._DscLines[start:end])=0D
+ bsf_temp_dict, temp_file_dict =3D self.process_template_lines(=0D
+ self.gen_cfg_data._DscLines[start:end])=0D
template_dict =3D dict()=0D
lines =3D []=0D
file_lines =3D {}=0D
@@ -679,15 +451,18 @@ class CFspDsc2Yaml():
file_lines[last_file] =3D []=0D
=0D
for tmp_name in temp_file_dict:=0D
- temp_file_dict[tmp_name][-1] =3D self.normalize_file_name(temp=
_file_dict[tmp_name][-1], True)=0D
+ temp_file_dict[tmp_name][-1] =3D self.normalize_file_name(=0D
+ temp_file_dict[tmp_name][-1], True)=0D
if len(temp_file_dict[tmp_name]) > 1:=0D
- temp_file_dict[tmp_name][-2] =3D self.normalize_file_name(=
temp_file_dict[tmp_name][-2], True)=0D
+                temp_file_dict[tmp_name][-2] = self.normalize_file_name(
+                    temp_file_dict[tmp_name][-2], True)
=0D
for tmp_name in bsf_temp_dict:=0D
file =3D temp_file_dict[tmp_name][-1]=0D
if last_file !=3D file and len(temp_file_dict[tmp_name]) > 1:=
=0D
inc_file =3D temp_file_dict[tmp_name][-2]=0D
- file_lines[inc_file].extend(['', '- !include %s' % temp_fi=
le_dict[tmp_name][-1], ''])=0D
+ file_lines[inc_file].extend(=0D
+ ['', '- !include %s' % temp_file_dict[tmp_name][-1], '=
'])=0D
last_file =3D file=0D
if file not in file_lines:=0D
file_lines[file] =3D []=0D
@@ -708,7 +483,8 @@ class CFspDsc2Yaml():
self.offset =3D 0=0D
self.base_offset =3D 0=0D
start, end =3D self.get_section_range('PcdsDynamicVpd.Upd')=0D
- cfgs =3D self.process_option_lines(self.gen_cfg_data._DscLines[sta=
rt:end])=0D
+ cfgs =3D self.process_option_lines(=0D
+ self.gen_cfg_data._DscLines[start:end])=0D
self.config_fixup(cfgs)=0D
file_lines =3D self.output_dict(cfgs, True)=0D
return file_lines=0D
@@ -721,13 +497,17 @@ class CFspDsc2Yaml():
level =3D 0=0D
file =3D '.'=0D
for each in cfgs:=0D
- if 'length' in each and int(each['length'], 0) =3D=3D 0:=0D
- continue=0D
+ if 'length' in each:=0D
+                if not each['length'].endswith('b') and \
+                        int(each['length'], 0) == 0:
+                    continue
=0D
if 'include' in each:=0D
if each['include']:=0D
- each['include'] =3D self.normalize_file_name(each['inc=
lude'])=0D
- file_lines[file].extend(['', '- !include %s' % each['i=
nclude'], ''])=0D
+ each['include'] =3D self.normalize_file_name(=0D
+ each['include'])=0D
+ file_lines[file].extend(=0D
+ ['', '- !include %s' % each['include'], ''])=0D
file =3D each['include']=0D
else:=0D
file =3D '.'=0D
@@ -766,7 +546,8 @@ class CFspDsc2Yaml():
for field in each:=0D
if field in ['cname', 'expand', 'include']:=0D
continue=0D
- value_str =3D self.format_value(field, each[field], paddin=
g + ' ' * 16)=0D
+ value_str =3D self.format_value(=0D
+ field, each[field], padding + ' ' * 16)=0D
full_line =3D ' %s %-12s : %s' % (padding, field, value_=
str)=0D
lines.extend(full_line.splitlines())=0D
=0D
@@ -802,11 +583,13 @@ def dsc_to_yaml(dsc_file, yaml_file):
if file =3D=3D '.':=0D
cfgs[cfg] =3D lines=0D
else:=0D
- if('/' in file or '\\' in file):=0D
+ if ('/' in file or '\\' in file):=0D
continue=0D
file =3D os.path.basename(file)=0D
- fo =3D open(os.path.join(file), 'w')=0D
- fo.write(__copyright_tmp__ % (cfg, date.today().year) + '\=
n\n')=0D
+ out_dir =3D os.path.dirname(file)=0D
+ fo =3D open(os.path.join(out_dir, file), 'w')=0D
+ fo.write(__copyright_tmp__ % (=0D
+ cfg, date.today().year) + '\n\n')=0D
for line in lines:=0D
fo.write(line + '\n')=0D
fo.close()=0D
@@ -821,13 +604,11 @@ def dsc_to_yaml(dsc_file, yaml_file):
=0D
fo.write('\n\ntemplate:\n')=0D
for line in cfgs['Template']:=0D
- if line !=3D '':=0D
- fo.write(' ' + line + '\n')=0D
+ fo.write(' ' + line + '\n')=0D
=0D
fo.write('\n\nconfigs:\n')=0D
for line in cfgs['Option']:=0D
- if line !=3D '':=0D
- fo.write(' ' + line + '\n')=0D
+ fo.write(' ' + line + '\n')=0D
=0D
fo.close()=0D
=0D
@@ -864,7 +645,8 @@ def main():
bsf_file =3D sys.argv[1]=0D
yaml_file =3D sys.argv[2]=0D
if os.path.isdir(yaml_file):=0D
- yaml_file =3D os.path.join(yaml_file, get_fsp_name_from_path(bsf_f=
ile) + '.yaml')=0D
+ yaml_file =3D os.path.join(=0D
+ yaml_file, get_fsp_name_from_path(bsf_file) + '.yaml')=0D
=0D
if bsf_file.endswith('.dsc'):=0D
dsc_file =3D bsf_file=0D
diff --git a/IntelFsp2Pkg/Tools/FspGenCfgData.py b/IntelFsp2Pkg/Tools/FspGenCfgData.py
new file mode 100644
index 0000000000..8d4e49c8d2
--- /dev/null
+++ b/IntelFsp2Pkg/Tools/FspGenCfgData.py
@@ -0,0 +1,2637 @@
+# @ GenCfgData.py=0D
+#=0D
+# Copyright (c) 2014 - 2021, Intel Corporation. All rights reserved.<BR>=0D
+# SPDX-License-Identifier: BSD-2-Clause-Patent=0D
+#=0D
+##=0D
+=0D
+import os=0D
+import re=0D
+import sys=0D
+import marshal=0D
+from functools import reduce=0D
+from datetime import date=0D
+=0D
+# Generated file copyright header=0D
+=0D
+__copyright_tmp__ =3D """/** @file=0D
+=0D
+ Configuration %s File.=0D
+=0D
+ Copyright (c) %4d, Intel Corporation. All rights reserved.<BR>=0D
+ SPDX-License-Identifier: BSD-2-Clause-Patent=0D
+=0D
+ This file is automatically generated. Please do NOT modify !!!=0D
+=0D
+**/=0D
+"""=0D
+=0D
+__copyright_dsc__ =3D """## @file=0D
+#=0D
+# Copyright (c) %04d, Intel Corporation. All rights reserved.<BR>=0D
+# SPDX-License-Identifier: BSD-2-Clause-Patent=0D
+#=0D
+##=0D
+=0D
+[PcdsDynamicVpd.Upd]=0D
+ #=0D
+ # Global definitions in BSF=0D
+ # !BSF BLOCK:{NAME:"FSP UPD Configuration", VER:"0.1"}=0D
+ #=0D
+=0D
+"""=0D
+=0D
+=0D
+def Bytes2Val(Bytes):=0D
+ return reduce(lambda x, y: (x << 8) | y, Bytes[::-1])=0D
+=0D
+=0D
+def Bytes2Str(Bytes):=0D
+ return '{ %s }' % (', '.join('0x%02X' % i for i in Bytes))=0D
+=0D
+=0D
+def Str2Bytes(Value, Blen):=0D
+ Result =3D bytearray(Value[1:-1], 'utf-8') # Excluding quotes=0D
+ if len(Result) < Blen:=0D
+ Result.extend(b'\x00' * (Blen - len(Result)))=0D
+ return Result=0D
+=0D
+=0D
+def Val2Bytes(Value, Blen):=0D
+ return [(Value >> (i * 8) & 0xff) for i in range(Blen)]=0D
+=0D
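+# Bytes2Val and Val2Bytes above are little-endian inverses of each other; a
+# self-contained sketch with a round-trip check (helper names here are
+# illustrative, not the patch's):

```python
from functools import reduce


def bytes_to_val(byte_list):
    # Little-endian byte list -> integer, as Bytes2Val above.
    return reduce(lambda x, y: (x << 8) | y, byte_list[::-1])


def val_to_bytes(value, blen):
    # Inverse: integer -> little-endian list of blen bytes.
    return [(value >> (i * 8)) & 0xff for i in range(blen)]
```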
+=0D
+def Array2Val(ValStr):=0D
+ ValStr =3D ValStr.strip()=0D
+ if ValStr.startswith('{'):=0D
+ ValStr =3D ValStr[1:]=0D
+ if ValStr.endswith('}'):=0D
+ ValStr =3D ValStr[:-1]=0D
+ if ValStr.startswith("'"):=0D
+ ValStr =3D ValStr[1:]=0D
+ if ValStr.endswith("'"):=0D
+ ValStr =3D ValStr[:-1]=0D
+ Value =3D 0=0D
+ for Each in ValStr.split(',')[::-1]:=0D
+ Each =3D Each.strip()=0D
+ if Each.startswith('0x'):=0D
+ Base =3D 16=0D
+ else:=0D
+ Base =3D 10=0D
+ Value =3D (Value << 8) | int(Each, Base)=0D
+ return Value=0D
+=0D
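+# Array2Val above folds a "{ 0xNN, ... }" little-endian byte list into one
+# integer. A simplified sketch of the same parsing (stripping braces and
+# quotes, then accumulating bytes from the end):

```python
def array_str_to_int(val_str):
    # Mirrors Array2Val: "{ 0x34, 0x12 }" is little-endian -> 0x1234.
    val_str = val_str.strip().strip('{}').strip().strip("'")
    value = 0
    for each in val_str.split(',')[::-1]:
        each = each.strip()
        value = (value << 8) | int(each, 16 if each.startswith('0x') else 10)
    return value
```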
+=0D
+def GetCopyrightHeader(FileType, AllowModify=3DFalse):=0D
+ FileDescription =3D {=0D
+ 'bsf': 'Boot Setting',=0D
+ 'dsc': 'Definition',=0D
+ 'dlt': 'Delta',=0D
+ 'inc': 'C Binary Blob',=0D
+ 'h': 'C Struct Header'=0D
+ }=0D
+ if FileType in ['bsf', 'dsc', 'dlt']:=0D
+ CommentChar =3D '#'=0D
+ else:=0D
+ CommentChar =3D ''=0D
+ Lines =3D __copyright_tmp__.split('\n')=0D
+=0D
+ if AllowModify:=0D
+        Lines = [Line for Line in Lines if 'Please do NOT modify' not in Line]
+=0D
+ CopyrightHdr =3D '\n'.join('%s%s' % (=0D
+ CommentChar, Line) for Line in Lines)[:-1] + '\n'=0D
+=0D
+ return CopyrightHdr % (FileDescription[FileType], date.today().year)=0D
+=0D
+=0D
+class CLogicalExpression:=0D
+ def __init__(self):=0D
+ self.index =3D 0=0D
+ self.string =3D ''=0D
+=0D
+ def errExit(self, err=3D''):=0D
+        print("ERROR: Expression parsing for:")
+ print(" %s" % self.string)=0D
+ print(" %s^" % (' ' * self.index))=0D
+ if err:=0D
+ print("INFO : %s" % err)=0D
+ raise SystemExit=0D
+=0D
+ def getNonNumber(self, n1, n2):=0D
+ if not n1.isdigit():=0D
+ return n1=0D
+ if not n2.isdigit():=0D
+ return n2=0D
+ return None=0D
+=0D
+ def getCurr(self, lens=3D1):=0D
+ try:=0D
+ if lens =3D=3D -1:=0D
+ return self.string[self.index:]=0D
+ else:=0D
+ if self.index + lens > len(self.string):=0D
+ lens =3D len(self.string) - self.index=0D
+ return self.string[self.index: self.index + lens]=0D
+ except Exception:=0D
+ return ''=0D
+=0D
+ def isLast(self):=0D
+ return self.index =3D=3D len(self.string)=0D
+=0D
+ def moveNext(self, len=3D1):=0D
+ self.index +=3D len=0D
+=0D
+ def skipSpace(self):=0D
+ while not self.isLast():=0D
+ if self.getCurr() in ' \t':=0D
+ self.moveNext()=0D
+ else:=0D
+ return=0D
+=0D
+ def normNumber(self, val):=0D
+ return True if val else False=0D
+=0D
+ def getNumber(self, var):=0D
+ var =3D var.strip()=0D
+ if re.match('^0x[a-fA-F0-9]+$', var):=0D
+ value =3D int(var, 16)=0D
+ elif re.match('^[+-]?\\d+$', var):=0D
+ value =3D int(var, 10)=0D
+ else:=0D
+ value =3D None=0D
+ return value=0D
+=0D
+ def parseValue(self):=0D
+ self.skipSpace()=0D
+ var =3D ''=0D
+ while not self.isLast():=0D
+ char =3D self.getCurr()=0D
+ if re.match('^[\\w.]', char):=0D
+ var +=3D char=0D
+ self.moveNext()=0D
+ else:=0D
+ break=0D
+ val =3D self.getNumber(var)=0D
+ if val is None:=0D
+ value =3D var=0D
+ else:=0D
+ value =3D "%d" % val=0D
+ return value=0D
+=0D
+ def parseSingleOp(self):=0D
+ self.skipSpace()=0D
+ if re.match('^NOT\\W', self.getCurr(-1)):=0D
+ self.moveNext(3)=0D
+ op =3D self.parseBrace()=0D
+ val =3D self.getNumber(op)=0D
+ if val is None:=0D
+ self.errExit("'%s' is not a number" % op)=0D
+ return "%d" % (not self.normNumber(int(op)))=0D
+ else:=0D
+ return self.parseValue()=0D
+=0D
+ def parseBrace(self):=0D
+ self.skipSpace()=0D
+ char =3D self.getCurr()=0D
+ if char =3D=3D '(':=0D
+ self.moveNext()=0D
+ value =3D self.parseExpr()=0D
+ self.skipSpace()=0D
+ if self.getCurr() !=3D ')':=0D
+ self.errExit("Expecting closing brace or operator")=0D
+ self.moveNext()=0D
+ return value=0D
+ else:=0D
+ value =3D self.parseSingleOp()=0D
+ return value=0D
+=0D
+ def parseCompare(self):=0D
+ value =3D self.parseBrace()=0D
+ while True:=0D
+ self.skipSpace()=0D
+ char =3D self.getCurr()=0D
+ if char in ['<', '>']:=0D
+ self.moveNext()=0D
+ next =3D self.getCurr()=0D
+ if next =3D=3D '=3D':=0D
+ op =3D char + next=0D
+ self.moveNext()=0D
+ else:=0D
+ op =3D char=0D
+ result =3D self.parseBrace()=0D
+ test =3D self.getNonNumber(result, value)=0D
+ if test is None:=0D
+                    value = "%d" % self.normNumber(eval(value + op + result))
+ else:=0D
+                    self.errExit("'%s' is not a valid number for comparison"
+                                 % test)
+ elif char in ['=3D', '!']:=0D
+ op =3D self.getCurr(2)=0D
+ if op in ['=3D=3D', '!=3D']:=0D
+ self.moveNext(2)=0D
+ result =3D self.parseBrace()=0D
+ test =3D self.getNonNumber(result, value)=0D
+ if test is None:=0D
+ value =3D "%d" % self.normNumber((eval(value + op=
=0D
+ + result)))=0D
+ else:=0D
+ value =3D "%d" % self.normNumber(eval("'" + value =
+=0D
+ "'" + op + "'"=
+=0D
+ result + "'"))=
=0D
+ else:=0D
+ break=0D
+ else:=0D
+ break=0D
+ return value=0D
+=0D
+    def parseAnd(self):
+        value = self.parseCompare()
+        while True:
+            self.skipSpace()
+            if re.match('^AND\\W', self.getCurr(-1)):
+                self.moveNext(3)
+                result = self.parseCompare()
+                test = self.getNonNumber(result, value)
+                if test is None:
+                    value = "%d" % self.normNumber(int(value) & int(result))
+                else:
+                    self.errExit("'%s' is not a valid op number for AND" %
+                                 test)
+            else:
+                break
+        return value
+
+    def parseOrXor(self):
+        value = self.parseAnd()
+        op = None
+        while True:
+            self.skipSpace()
+            op = None
+            if re.match('^XOR\\W', self.getCurr(-1)):
+                self.moveNext(3)
+                op = '^'
+            elif re.match('^OR\\W', self.getCurr(-1)):
+                self.moveNext(2)
+                op = '|'
+            else:
+                break
+            if op:
+                result = self.parseAnd()
+                test = self.getNonNumber(result, value)
+                if test is None:
+                    value = "%d" % self.normNumber(eval(value + op + result))
+                else:
+                    self.errExit("'%s' is not a valid op number for XOR/OR" %
+                                 test)
+        return value
+
+    def parseExpr(self):
+        return self.parseOrXor()
+
+    def getResult(self):
+        value = self.parseExpr()
+        self.skipSpace()
+        if not self.isLast():
+            self.errExit("Unexpected character found '%s'" % self.getCurr())
+        test = self.getNumber(value)
+        if test is None:
+            self.errExit("Result '%s' is not a number" % value)
+        return int(value)
+
+    def evaluateExpress(self, Expr):
+        self.index = 0
+        self.string = Expr
+        if self.getResult():
+            Result = True
+        else:
+            Result = False
+        return Result
+
+
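As an aside for reviewers, and not part of the patch: the evaluator above encodes operator precedence by chaining parse levels (parseExpr delegates to parseOrXor, then parseAnd, parseCompare, and parseBrace). A minimal standalone sketch of the same precedence-by-recursion idea, with invented names:

```python
import re

# One function per precedence level: each level parses the next-tighter
# level first, then folds in its own operator while it keeps matching.
def parse_expr(tokens):
    # Lowest precedence: OR.
    val = parse_and(tokens)
    while tokens and tokens[0] == 'OR':
        tokens.pop(0)
        rhs = parse_and(tokens)
        val = 1 if (val or rhs) else 0
    return val

def parse_and(tokens):
    # Binds tighter than OR.
    val = parse_atom(tokens)
    while tokens and tokens[0] == 'AND':
        tokens.pop(0)
        rhs = parse_atom(tokens)
        val = 1 if (val and rhs) else 0
    return val

def parse_atom(tokens):
    # Numbers (decimal or hex) and parenthesized sub-expressions.
    tok = tokens.pop(0)
    if tok == '(':
        val = parse_expr(tokens)
        tokens.pop(0)  # consume ')'
        return val
    return int(tok, 0)

def evaluate(text):
    tokens = re.findall(r'\(|\)|AND|OR|0x[0-9a-fA-F]+|\d+', text)
    return parse_expr(tokens)
```

Here `evaluate('1 AND (0 OR 2)')` yields 1: the parenthesized sub-expression is handled by the recursive call from the atom level, exactly as parseBrace recurses into parseExpr above.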
+class CFspBsf2Dsc:
+
+    def __init__(self, bsf_file):
+        self.cfg_list = CFspBsf2Dsc.parse_bsf(bsf_file)
+
+    def get_dsc_lines(self):
+        return CFspBsf2Dsc.generate_dsc(self.cfg_list)
+
+    def save_dsc(self, dsc_file):
+        return CFspBsf2Dsc.generate_dsc(self.cfg_list, dsc_file)
+
+    @staticmethod
+    def parse_bsf(bsf_file):
+
+        fd = open(bsf_file, 'r')
+        bsf_txt = fd.read()
+        fd.close()
+
+        find_list = []
+        regex = re.compile(r'\s+Find\s+"(.*?)"(.*?)^\s+(\$(.*?)|Skip)\s+',
+                           re.S | re.MULTILINE)
+        for match in regex.finditer(bsf_txt):
+            find = match.group(1)
+            name = match.group(3)
+            line = bsf_txt[:match.end()].count("\n")
+            find_list.append((name, find, line))
+
+        idx = 0
+        count = 0
+        prefix = ''
+        chk_dict = {}
+        cfg_list = []
+        cfg_temp = {'find': '', 'cname': '', 'length': 0, 'value': '0',
+                    'type': 'Reserved', 'isbit': False,
+                    'embed': '', 'page': '', 'option': '', 'instance': 0}
+        regex = re.compile(
+            r'^\s+(\$(.*?)|Skip)\s+(\d+)\s+(bits|bytes)(\s+\$_DEFAULT_\s'
+            r'+=\s+(.+?))?$', re.S |
+            re.MULTILINE)
+
+        for match in regex.finditer(bsf_txt):
+            dlen = int(match.group(3))
+            if match.group(1) == 'Skip':
+                key = 'gPlatformFspPkgTokenSpaceGuid_BsfSkip%d' % idx
+                val = ', '.join(['%02X' % ord(i) for i in '\x00' * dlen])
+                idx += 1
+                option = '$SKIP'
+            else:
+                key = match.group(2)
+                val = match.group(6)
+                option = ''
+            is_bit = True if match.group(4) == 'bits' else False
+
+            cfg_item = dict(cfg_temp)
+            line = bsf_txt[:match.end()].count("\n")
+            finds = [i for i in find_list if line >= i[2]]
+            if len(finds) > 0:
+                prefix = finds[0][1]
+                cfg_item['embed'] = '%s:TAG_%03X:START' % \
+                                    (prefix, ord(prefix[-1]))
+                cfg_item['find'] = prefix
+                cfg_item['cname'] = 'Signature'
+                cfg_item['length'] = len(finds[0][1])
+                str2byte = Str2Bytes("'" + finds[0][1] + "'",
+                                     len(finds[0][1]))
+                cfg_item['value'] = '0x%X' % Bytes2Val(str2byte)
+
+                cfg_list.append(dict(cfg_item))
+                cfg_item = dict(cfg_temp)
+                find_list.pop(0)
+                count = 0
+
+            cfg_item['cname'] = key
+            cfg_item['length'] = dlen
+            cfg_item['value'] = val
+            cfg_item['option'] = option
+            cfg_item['isbit'] = is_bit
+
+            if key not in chk_dict.keys():
+                chk_dict[key] = 0
+            else:
+                chk_dict[key] += 1
+            cfg_item['instance'] = chk_dict[key]
+
+            cfg_list.append(cfg_item)
+            count += 1
+
+        if prefix:
+            cfg_item = dict(cfg_temp)
+            cfg_item['cname'] = 'Dummy'
+            cfg_item['embed'] = '%s:%03X:END' % (prefix, ord(prefix[-1]))
+            cfg_list.append(cfg_item)
+
+        option_dict = {}
+        selreg = re.compile(
+            r'\s+Selection\s*(.+?)\s*,\s*"(.*?)"$', re.S |
+            re.MULTILINE)
+        regex = re.compile(
+            r'^List\s&(.+?)$(.+?)^EndList$', re.S | re.MULTILINE)
+        for match in regex.finditer(bsf_txt):
+            key = match.group(1)
+            option_dict[key] = []
+            for select in selreg.finditer(match.group(2)):
+                option_dict[key].append(
+                    (int(select.group(1), 0), select.group(2)))
+
+        chk_dict = {}
+        pagereg = re.compile(
+            r'^Page\s"(.*?)"$(.+?)^EndPage$', re.S | re.MULTILINE)
+        for match in pagereg.finditer(bsf_txt):
+            page = match.group(1)
+            for line in match.group(2).splitlines():
+                match = re.match(
+                    r'\s+(Combo|EditNum)\s\$(.+?),\s"(.*?)",\s(.+?),$', line)
+                if match:
+                    cname = match.group(2)
+                    if cname not in chk_dict.keys():
+                        chk_dict[cname] = 0
+                    else:
+                        chk_dict[cname] += 1
+                    instance = chk_dict[cname]
+                    cfg_idxs = [i for i, j in enumerate(cfg_list)
+                                if j['cname'] == cname and
+                                j['instance'] == instance]
+                    if len(cfg_idxs) != 1:
+                        raise Exception(
+                            "Multiple CFG item '%s' found !" % cname)
+                    cfg_item = cfg_list[cfg_idxs[0]]
+                    cfg_item['page'] = page
+                    cfg_item['type'] = match.group(1)
+                    cfg_item['prompt'] = match.group(3)
+                    cfg_item['range'] = None
+                    if cfg_item['type'] == 'Combo':
+                        cfg_item['option'] = option_dict[match.group(4)[1:]]
+                    elif cfg_item['type'] == 'EditNum':
+                        cfg_item['option'] = match.group(4)
+                match = re.match(r'\s+ Help\s"(.*?)"$', line)
+                if match:
+                    cfg_item['help'] = match.group(1)
+
+                match = re.match(r'\s+"Valid\srange:\s(.*)"$', line)
+                if match:
+                    parts = match.group(1).split()
+                    cfg_item['option'] = (
+                        (int(parts[0], 0), int(parts[2], 0),
+                         cfg_item['option']))
+
+        return cfg_list
+
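As an aside for reviewers, and not part of the patch: the MULTILINE regex above matches both named fields (`$Name n bytes`, optionally with a `$_DEFAULT_` value) and anonymous `Skip` padding. A self-contained sketch of that match against a two-line BSF-style fragment (the field name `PcdSerialPort` is invented for the demo):

```python
import re

# The same field/skip pattern used in parse_bsf above, applied to a
# small invented fragment to show which groups carry what.
bsf_txt = """
    $PcdSerialPort    4 bytes    $_DEFAULT_ = 0x3F8
    Skip              2 bytes
"""
regex = re.compile(
    r'^\s+(\$(.*?)|Skip)\s+(\d+)\s+(bits|bytes)(\s+\$_DEFAULT_\s'
    r'+=\s+(.+?))?$', re.S | re.MULTILINE)
items = []
for match in regex.finditer(bsf_txt):
    # group(2): field name (None for Skip), group(3): size,
    # group(4): unit, group(6): default value (None when absent).
    items.append((match.group(2), int(match.group(3)),
                  match.group(4), match.group(6)))
```

For the fragment above, `items` ends up as `[('PcdSerialPort', 4, 'bytes', '0x3F8'), (None, 2, 'bytes', None)]`, which is why the code branches on `match.group(1) == 'Skip'` and synthesizes a `BsfSkip%d` name for padding.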
+    @staticmethod
+    def generate_dsc(option_list, dsc_file=None):
+        dsc_lines = []
+        header = '%s' % (__copyright_dsc__ % date.today().year)
+        dsc_lines.extend(header.splitlines())
+
+        pages = []
+        for cfg_item in option_list:
+            if cfg_item['page'] and (cfg_item['page'] not in pages):
+                pages.append(cfg_item['page'])
+
+        page_id = 0
+        for page in pages:
+            dsc_lines.append('  # !BSF PAGES:{PG%02X::"%s"}' % (page_id, page))
+            page_id += 1
+        dsc_lines.append('')
+
+        last_page = ''
+
+        is_bit = False
+        dlen = 0
+        dval = 0
+        bit_fields = []
+        for idx, option in enumerate(option_list):
+            if not is_bit and option['isbit']:
+                is_bit = True
+                dlen = 0
+                dval = 0
+                idxs = idx
+            if is_bit and not option['isbit']:
+                is_bit = False
+                if dlen % 8 != 0:
+                    raise Exception("Bit fields are not aligned at "
+                                    "byte boundary !")
+                bit_fields.append((idxs, idx, dlen, dval))
+            if is_bit:
+                blen = option['length']
+                bval = int(option['value'], 0)
+                dval = dval + ((bval & ((1 << blen) - 1)) << dlen)
+                print(dlen, blen, bval, hex(dval))
+                dlen += blen
+
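As an aside for reviewers, and not part of the patch: the `dval`/`dlen` accumulation in the loop above packs consecutive bit-field values LSB-first into a single integer, then requires the total to land on a byte boundary. The same accumulation in isolation (`pack_bits` is an invented name for this sketch):

```python
# Pack (value, bit_length) fields LSB-first, mirroring the loop above;
# returns the packed integer and its size in whole bytes.
def pack_bits(fields):
    dval, dlen = 0, 0
    for bval, blen in fields:
        dval += (bval & ((1 << blen) - 1)) << dlen
        dlen += blen
    if dlen % 8 != 0:
        raise ValueError("Bit fields are not aligned at byte boundary !")
    return dval, dlen // 8
```

For example, `pack_bits([(0b101, 3), (0b11, 5)])` packs a 3-bit field and a 5-bit field into one byte, `0x1D`: the second field's value is masked to 5 bits and shifted past the 3 bits already consumed.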
+        struct_idx = 0
+        for idx, option in enumerate(option_list):
+            dsc_lines.append('')
+            default = option['value']
+            pos = option['cname'].find('_')
+            name = option['cname'][pos + 1:]
+
+            for start_idx, end_idx, bits_len, bits_val in bit_fields:
+                if idx == start_idx:
+                    val_str = Bytes2Str(Val2Bytes(bits_val, bits_len // 8))
+                    dsc_lines.append('  # !HDR STRUCT:{BIT_FIELD_DATA_%d}'
+                                     % struct_idx)
+                    dsc_lines.append('  # !BSF NAME:{BIT_FIELD_STRUCT}')
+                    dsc_lines.append('  gCfgData.BitFiledStruct%d '
+                                     ' | * | 0x%04X | %s' %
+                                     (struct_idx, bits_len // 8, val_str))
+                    dsc_lines.append('')
+                    struct_idx += 1
+
+            if option['find']:
+                dsc_lines.append('  # !BSF FIND:{%s}' % option['find'])
+                dsc_lines.append('')
+
+            if option['instance'] > 0:
+                name = name + '_%s' % option['instance']
+
+            if option['embed']:
+                dsc_lines.append('  # !HDR EMBED:{%s}' % option['embed'])
+
+            if option['type'] == 'Reserved':
+                dsc_lines.append('  # !BSF NAME:{Reserved} TYPE:{Reserved}')
+                if option['option'] == '$SKIP':
+                    dsc_lines.append('  # !BSF OPTION:{$SKIP}')
+            else:
+                prompt = option['prompt']
+
+                if last_page != option['page']:
+                    last_page = option['page']
+                    dsc_lines.append('  # !BSF PAGE:{PG%02X}' %
+                                     (pages.index(option['page'])))
+
+                if option['type'] == 'Combo':
+                    dsc_lines.append('  # !BSF NAME:{%s} TYPE:{%s}' %
+                                     (prompt, option['type']))
+                    ops = []
+                    for val, text in option['option']:
+                        ops.append('0x%x:%s' % (val, text))
+                    dsc_lines.append('  # !BSF OPTION:{%s}' % (', '.join(ops)))
+                elif option['type'] == 'EditNum':
+                    cfg_len = option['length']
+                    if ',' in default and cfg_len > 8:
+                        dsc_lines.append('  # !BSF NAME:{%s} TYPE:{Table}' %
+                                         (prompt))
+                        if cfg_len > 16:
+                            cfg_len = 16
+                        ops = []
+                        for i in range(cfg_len):
+                            ops.append('%X:1:HEX' % i)
+                        dsc_lines.append('  # !BSF OPTION:{%s}' %
+                                         (', '.join(ops)))
+                    else:
+                        dsc_lines.append(
+                            '  # !BSF NAME:{%s} TYPE:{%s, %s, (0x%X, 0x%X)}' %
+                            (prompt, option['type'], option['option'][2],
+                             option['option'][0], option['option'][1]))
+                dsc_lines.append('  # !BSF HELP:{%s}' % option['help'])
+
+                if ',' in default:
+                    default = '{%s}' % default
+
+                if option['isbit']:
+                    dsc_lines.append('  # !BSF FIELD:{%s:%db}'
+                                     % (name, option['length']))
+                else:
+                    dsc_lines.append('  gCfgData.%-30s | * | 0x%04X | %s' %
+                                     (name, option['length'], default))
+
+        if dsc_file:
+            fd = open(dsc_file, 'w')
+            fd.write('\n'.join(dsc_lines))
+            fd.close()
+
+        return dsc_lines
+
+
+class CGenCfgData:
+    def __init__(self, Mode=''):
+        self.Debug = False
+        self.Error = ''
+        self.ReleaseMode = True
+        self.Mode = Mode
+        self._GlobalDataDef = """
+GlobalDataDef
+    SKUID = 0, "DEFAULT"
+EndGlobalData
+
+"""
+        self._BuidinOptionTxt = """
+List &EN_DIS
+    Selection 0x1 , "Enabled"
+    Selection 0x0 , "Disabled"
+EndList
+
+"""
+        self._StructType = ['UINT8', 'UINT16', 'UINT32', 'UINT64']
+        self._BsfKeyList = ['FIND', 'NAME', 'HELP', 'TYPE', 'PAGE', 'PAGES',
+                            'BLOCK', 'OPTION', 'CONDITION', 'ORDER', 'MARKER',
+                            'SUBT']
+        self._HdrKeyList = ['HEADER', 'STRUCT', 'EMBED', 'COMMENT']
+        self._BuidinOption = {'$EN_DIS': 'EN_DIS'}
+
+        self._MacroDict = {}
+        self._VarDict = {}
+        self._PcdsDict = {}
+        self._CfgBlkDict = {}
+        self._CfgPageDict = {}
+        self._CfgOptsDict = {}
+        self._BsfTempDict = {}
+        self._CfgItemList = []
+        self._DscLines = []
+        self._DscFile = ''
+        self._CfgPageTree = {}
+
+        self._MapVer = 0
+        self._MinCfgTagId = 0x100
+
+    def ParseMacros(self, MacroDefStr):
+        # ['-DABC=1', '-D', 'CFG_DEBUG=1', '-D', 'CFG_OUTDIR=Build']
+        self._MacroDict = {}
+        IsExpression = False
+        for Macro in MacroDefStr:
+            if Macro.startswith('-D'):
+                IsExpression = True
+                if len(Macro) > 2:
+                    Macro = Macro[2:]
+                else:
+                    continue
+            if IsExpression:
+                IsExpression = False
+                Match = re.match("(\\w+)=(.+)", Macro)
+                if Match:
+                    self._MacroDict[Match.group(1)] = Match.group(2)
+                else:
+                    Match = re.match("(\\w+)", Macro)
+                    if Match:
+                        self._MacroDict[Match.group(1)] = ''
+        if len(self._MacroDict) == 0:
+            Error = 1
+        else:
+            Error = 0
+            if self.Debug:
+                print("INFO : Macro dictionary:")
+                for Each in self._MacroDict:
+                    print("      $(%s) = [ %s ]" % (Each,
+                                                    self._MacroDict[Each]))
+        return Error
+
+    def EvaulateIfdef(self, Macro):
+        Result = Macro in self._MacroDict
+        if self.Debug:
+            print("INFO : Eval Ifdef [%s] : %s" % (Macro, Result))
+        return Result
+
+    def ExpandMacros(self, Input, Preserve=False):
+        Line = Input
+        Match = re.findall("\\$\\(\\w+\\)", Input)
+        if Match:
+            for Each in Match:
+                Variable = Each[2:-1]
+                if Variable in self._MacroDict:
+                    Line = Line.replace(Each, self._MacroDict[Variable])
+                else:
+                    if self.Debug:
+                        print("WARN : %s is not defined" % Each)
+                    if not Preserve:
+                        Line = Line.replace(Each, Each[2:-1])
+        return Line
+
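As an aside for reviewers, and not part of the patch: ExpandMacros above substitutes every `$(NAME)` reference, and when a macro is undefined it either preserves the reference or strips the wrapper down to the bare name. The same behavior as a standalone function (`expand_macros` is an invented name for this sketch):

```python
import re

# Replace each $(NAME) with its value; undefined macros are left intact
# when preserve is set, otherwise reduced to the bare name.
def expand_macros(line, macros, preserve=False):
    for each in re.findall(r"\$\(\w+\)", line):
        name = each[2:-1]
        if name in macros:
            line = line.replace(each, macros[name])
        elif not preserve:
            line = line.replace(each, name)
    return line
```

So `expand_macros('$(A)/x/$(B)', {'A': 'out'})` gives `'out/x/B'`, while passing `preserve=True` would keep `$(B)` untouched, matching the `Preserve` flag used by SubtituteLine later in this file.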
+    def ExpandPcds(self, Input):
+        Line = Input
+        Match = re.findall("(\\w+\\.\\w+)", Input)
+        if Match:
+            for PcdName in Match:
+                if PcdName in self._PcdsDict:
+                    Line = Line.replace(PcdName, self._PcdsDict[PcdName])
+                else:
+                    if self.Debug:
+                        print("WARN : %s is not defined" % PcdName)
+        return Line
+
+    def EvaluateExpress(self, Expr):
+        ExpExpr = self.ExpandPcds(Expr)
+        ExpExpr = self.ExpandMacros(ExpExpr)
+        LogExpr = CLogicalExpression()
+        Result = LogExpr.evaluateExpress(ExpExpr)
+        if self.Debug:
+            print("INFO : Eval Express [%s] : %s" % (Expr, Result))
+        return Result
+
+    def ValueToByteArray(self, ValueStr, Length):
+        Match = re.match("\\{\\s*FILE:(.+)\\}", ValueStr)
+        if Match:
+            FileList = Match.group(1).split(',')
+            Result = bytearray()
+            for File in FileList:
+                File = File.strip()
+                BinPath = os.path.join(os.path.dirname(self._DscFile), File)
+                Result.extend(bytearray(open(BinPath, 'rb').read()))
+        else:
+            try:
+                Result = bytearray(self.ValueToList(ValueStr, Length))
+            except ValueError:
+                raise Exception("Bytes in '%s' must be in range 0~255 !" %
+                                ValueStr)
+        if len(Result) < Length:
+            Result.extend(b'\x00' * (Length - len(Result)))
+        elif len(Result) > Length:
+            raise Exception("Value '%s' is too big to fit into %d bytes !" %
+                            (ValueStr, Length))
+
+        return Result[:Length]
+
+    def ValueToList(self, ValueStr, Length):
+        if ValueStr[0] == '{':
+            Result = []
+            BinList = ValueStr[1:-1].split(',')
+            InBitField = False
+            LastInBitField = False
+            Value = 0
+            BitLen = 0
+            for Element in BinList:
+                InBitField = False
+                Each = Element.strip()
+                if len(Each) == 0:
+                    pass
+                else:
+                    if Each[0] in ['"', "'"]:
+                        Result.extend(list(bytearray(Each[1:-1], 'utf-8')))
+                    elif ':' in Each:
+                        Match = re.match("(.+):(\\d+)b", Each)
+                        if Match is None:
+                            raise Exception("Invalid value list format '%s' !"
+                                            % Each)
+                        InBitField = True
+                        CurrentBitLen = int(Match.group(2))
+                        CurrentValue = ((self.EvaluateExpress(Match.group(1))
+                                         & (1 << CurrentBitLen) - 1)) << BitLen
+                    else:
+                        Result.append(self.EvaluateExpress(Each.strip()))
+                if InBitField:
+                    Value += CurrentValue
+                    BitLen += CurrentBitLen
+                if LastInBitField and ((not InBitField) or (Element ==
+                                                            BinList[-1])):
+                    if BitLen % 8 != 0:
+                        raise Exception("Invalid bit field length!")
+                    Result.extend(Val2Bytes(Value, BitLen // 8))
+                    Value = 0
+                    BitLen = 0
+                LastInBitField = InBitField
+        elif ValueStr.startswith("'") and ValueStr.endswith("'"):
+            Result = Str2Bytes(ValueStr, Length)
+        elif ValueStr.startswith('"') and ValueStr.endswith('"'):
+            Result = Str2Bytes(ValueStr, Length)
+        else:
+            Result = Val2Bytes(self.EvaluateExpress(ValueStr), Length)
+        return Result
+
+    def FormatDeltaValue(self, ConfigDict):
+        ValStr = ConfigDict['value']
+        if ValStr[0] == "'":
+            # Remove padding \x00 in the value string
+            ValStr = "'%s'" % ValStr[1:-1].rstrip('\x00')
+
+        Struct = ConfigDict['struct']
+        if Struct in self._StructType:
+            # Format the array using its struct type
+            Unit = int(Struct[4:]) // 8
+            Value = Array2Val(ConfigDict['value'])
+            Loop = ConfigDict['length'] // Unit
+            Values = []
+            for Each in range(Loop):
+                Values.append(Value & ((1 << (Unit * 8)) - 1))
+                Value = Value >> (Unit * 8)
+            ValStr = '{ ' + ', '.join([('0x%%0%dX' % (Unit * 2)) %
+                                       x for x in Values]) + ' }'
+
+        return ValStr
+
+    def FormatListValue(self, ConfigDict):
+        Struct = ConfigDict['struct']
+        if Struct not in self._StructType:
+            return
+
+        DataList = self.ValueToList(ConfigDict['value'], ConfigDict['length'])
+        Unit = int(Struct[4:]) // 8
+        if int(ConfigDict['length']) != Unit * len(DataList):
+            # Fallback to byte array
+            Unit = 1
+            if int(ConfigDict['length']) != len(DataList):
+                raise Exception("Array size is not proper for '%s' !" %
+                                ConfigDict['cname'])
+
+        ByteArray = []
+        for Value in DataList:
+            for Loop in range(Unit):
+                ByteArray.append("0x%02X" % (Value & 0xFF))
+                Value = Value >> 8
+        NewValue = '{' + ','.join(ByteArray) + '}'
+        ConfigDict['value'] = NewValue
+
+        return ""
+
+    def GetOrderNumber(self, Offset, Order, BitOff=0):
+        if isinstance(Order, int):
+            if Order == -1:
+                Order = Offset << 16
+        else:
+            (Major, Minor) = Order.split('.')
+            Order = (int(Major, 16) << 16) + ((int(Minor, 16) & 0xFF) << 8)
+        return Order + (BitOff & 0xFF)
+
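As an aside for reviewers, and not part of the patch: GetOrderNumber above builds a single sortable key, with a `"Major.Minor"` hex pair mapped into the high bytes and a default order of `-1` falling back to the byte offset, while the bit offset occupies the low byte. The same computation in isolation (`order_number` is an invented name for this sketch):

```python
# Build a sortable ordering key: major in bits 16+, minor in bits 8-15,
# bit offset in bits 0-7; -1 means "derive major from the byte offset".
def order_number(offset, order, bit_off=0):
    if isinstance(order, int):
        if order == -1:
            order = offset << 16
    else:
        major, minor = order.split('.')
        order = (int(major, 16) << 16) + ((int(minor, 16) & 0xFF) << 8)
    return order + (bit_off & 0xFF)
```

So an item at offset 0x10 with no explicit order sorts as `0x100000`, and an explicit `'2.3'` with bit offset 5 sorts as `0x020305`, which keeps sub-register fields adjacent to their parent item.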
+    def SubtituteLine(self, Line, Args):
+        Args = Args.strip()
+        Vars = Args.split(':')
+        Line = self.ExpandMacros(Line, True)
+        for Idx in range(len(Vars)-1, 0, -1):
+            Line = Line.replace('$(%d)' % Idx, Vars[Idx].strip())
+        return Line
+
+    def CfgDuplicationCheck(self, CfgDict, Name):
+        if not self.Debug:
+            return
+
+        if Name == 'Dummy':
+            return
+
+        if Name not in CfgDict:
+            CfgDict[Name] = 1
+        else:
+            print("WARNING: Duplicated item found '%s' !" % Name)
+
+    def AddBsfChildPage(self, Child, Parent='root'):
+        def AddBsfChildPageRecursive(PageTree, Parent, Child):
+            Key = next(iter(PageTree))
+            if Parent == Key:
+                PageTree[Key].append({Child: []})
+                return True
+            else:
+                Result = False
+                for Each in PageTree[Key]:
+                    if AddBsfChildPageRecursive(Each, Parent, Child):
+                        Result = True
+                        break
+                return Result
+
+        return AddBsfChildPageRecursive(self._CfgPageTree, Parent, Child)
+
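As an aside for reviewers, and not part of the patch: AddBsfChildPage above stores the page hierarchy as nested single-key dicts (`{name: [children]}`) and inserts by depth-first search for the parent. A compact sketch of the same structure (`add_child_page` is an invented name):

```python
# Each node is a single-key dict {name: [child nodes]}; search
# depth-first for the parent and append the new child under it.
def add_child_page(tree, parent, child):
    key = next(iter(tree))
    if parent == key:
        tree[key].append({child: []})
        return True
    return any(add_child_page(each, parent, child)
               for each in tree[key])

page_tree = {'root': []}
add_child_page(page_tree, 'root', 'MEM')
add_child_page(page_tree, 'MEM', 'DDR')
# page_tree is now {'root': [{'MEM': [{'DDR': []}]}]}
```

The lazy `any()` over the recursive calls mirrors the early `break` in the original: once one subtree accepts the child, no further siblings are searched, and a missing parent simply yields False.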
+    def ParseDscFile(self, DscFile):
+        self._DscLines = []
+        self._CfgItemList = []
+        self._CfgPageDict = {}
+        self._CfgBlkDict = {}
+        self._BsfTempDict = {}
+        self._CfgPageTree = {'root': []}
+
+        CfgDict = {}
+
+        SectionNameList = ["Defines".lower(), "PcdsFeatureFlag".lower(),
+                           "PcdsDynamicVpd.Tmp".lower(),
+                           "PcdsDynamicVpd.Upd".lower()]
+
+        IsDefSect = False
+        IsPcdSect = False
+        IsUpdSect = False
+        IsTmpSect = False
+
+        TemplateName = ''
+
+        IfStack = []
+        ElifStack = []
+        Error = 0
+        ConfigDict = {}
+
+        if type(DscFile) is list:
+            # it is DSC lines already
+            DscLines = DscFile
+            self._DscFile = '.'
+        else:
+            DscFd = open(DscFile, "r")
+            DscLines = DscFd.readlines()
+            DscFd.close()
+            self._DscFile = DscFile
+
+        BsfRegExp = re.compile("(%s):{(.+?)}(?:$|\\s+)" % '|'.
+                               join(self._BsfKeyList))
+        HdrRegExp = re.compile("(%s):{(.+?)}" % '|'.join(self._HdrKeyList))
+        CfgRegExp = re.compile("^([_a-zA-Z0-9]+)\\s*\\|\\s*\
+(0x[0-9A-F]+|\\*)\\s*\\|\\s*(\\d+|0x[0-9a-fA-F]+)\\s*\\|\\s*(.+)")
+        TksRegExp = re.compile("^(g[_a-zA-Z0-9]+\\.)(.+)")
+        SkipLines = 0
+        while len(DscLines):
+            DscLine = DscLines.pop(0).strip()
+            if SkipLines == 0:
+                self._DscLines.append(DscLine)
+            else:
+                SkipLines = SkipLines - 1
+            if len(DscLine) == 0:
+                continue
+
+            Handle = False
+            Match = re.match("^\\[(.+)\\]", DscLine)
+            if Match is not None:
+                IsDefSect = False
+                IsPcdSect = False
+                IsUpdSect = False
+                IsTmpSect = False
+                SectionName = Match.group(1).lower()
+                if SectionName == SectionNameList[0]:
+                    IsDefSect = True
+                if SectionName == SectionNameList[1]:
+                    IsPcdSect = True
+                elif SectionName == SectionNameList[2]:
+                    IsTmpSect = True
+                elif SectionName == SectionNameList[3]:
+                    ConfigDict = {
+                        'header': 'ON',
+                        'page': '',
+                        'name': '',
+                        'find': '',
+                        'struct': '',
+                        'embed': '',
+                        'marker': '',
+                        'option': '',
+                        'comment': '',
+                        'condition': '',
+                        'order': -1,
+                        'subreg': []
+                    }
+                    IsUpdSect = True
+                    Offset = 0
+            else:
+                if IsDefSect or IsPcdSect or IsUpdSect or IsTmpSect:
+                    Match = False if DscLine[0] != '!' else True
+                    if Match:
+                        Match = re.match("^!(else|endif|ifdef|ifndef|if|elseif\
+|include)\\s*(.+)?$", DscLine.split("#")[0])
+                    Keyword = Match.group(1) if Match else ''
+                    Remaining = Match.group(2) if Match else ''
+                    Remaining = '' if Remaining is None else Remaining.strip()
+
+                    if Keyword in ['if', 'elseif', 'ifdef', 'ifndef', 'include'
+                                   ] and not Remaining:
+                        raise Exception("ERROR: Expression is expected after \
+'!if' or '!elseif' for line '%s'" % DscLine)
+
+                    if Keyword == 'else':
+                        if IfStack:
+                            IfStack[-1] = not IfStack[-1]
+                        else:
+                            raise Exception("ERROR: No paired '!if' found for \
+'!else' for line '%s'" % DscLine)
+                    elif Keyword == 'endif':
+                        if IfStack:
+                            IfStack.pop()
+                            Level = ElifStack.pop()
+                            if Level > 0:
+                                del IfStack[-Level:]
+                        else:
+                            raise Exception("ERROR: No paired '!if' found for \
+'!endif' for line '%s'" % DscLine)
+                    elif Keyword == 'ifdef' or Keyword == 'ifndef':
+                        Result = self.EvaulateIfdef(Remaining)
+                        if Keyword == 'ifndef':
+                            Result = not Result
+                        IfStack.append(Result)
+                        ElifStack.append(0)
+                    elif Keyword == 'if' or Keyword == 'elseif':
+                        Result = self.EvaluateExpress(Remaining)
+                        if Keyword == "if":
+                            ElifStack.append(0)
+                            IfStack.append(Result)
+                        else:  # elseif
+                            if IfStack:
+                                IfStack[-1] = not IfStack[-1]
+                                IfStack.append(Result)
+                                ElifStack[-1] = ElifStack[-1] + 1
+                            else:
+                                raise Exception("ERROR: No paired '!if' found \
+for '!elif' for line '%s'" % DscLine)
+                    else:
+                        if IfStack:
+                            Handle = reduce(lambda x, y: x and y, IfStack)
+                        else:
+                            Handle = True
+                        if Handle:
+                            if Keyword == 'include':
+                                Remaining = self.ExpandMacros(Remaining)
+                                # Relative to DSC filepath
+                                IncludeFilePath = os.path.join(
+                                    os.path.dirname(self._DscFile), Remaining)
+                                if not os.path.exists(IncludeFilePath):
+                                    # Relative to repository to find \
+                                    # dsc in common platform
+                                    IncludeFilePath = os.path.join(
+                                        os.path.dirname(self._DscFile), "..",
+                                        Remaining)
+
+                                try:
+                                    IncludeDsc = open(IncludeFilePath, "r")
+                                except Exception:
+                                    raise Exception("ERROR: Cannot open \
+file '%s'." % IncludeFilePath)
+                                NewDscLines = IncludeDsc.readlines()
+                                IncludeDsc.close()
+                                DscLines = NewDscLines + DscLines
+                                del self._DscLines[-1]
+                            else:
+                                if DscLine.startswith('!'):
+                                    raise Exception("ERROR: Unrecognized \
+directive for line '%s'" % DscLine)
+
+                    if not Handle:
+                        del self._DscLines[-1]
+                        continue
+
+            if IsDefSect:
+                Match = re.match("^\\s*(?:DEFINE\\s+)*(\\w+)\\s*=\\s*(.+)",
+                                 DscLine)
+                if Match:
+                    self._MacroDict[Match.group(1)] = Match.group(2)
+                    if self.Debug:
+                        print("INFO : DEFINE %s = [ %s ]" % (Match.group(1),
+                                                             Match.group(2)))
+
+            elif IsPcdSect:
+                Match = re.match("^\\s*([\\w\\.]+)\\s*\\|\\s*(\\w+)", DscLine)
+                if Match:
+                    self._PcdsDict[Match.group(1)] = Match.group(2)
+                    if self.Debug:
+                        print("INFO : PCD %s = [ %s ]" % (Match.group(1),
+                                                          Match.group(2)))
+
+            elif IsTmpSect:
+                # !BSF DEFT:{GPIO_TMPL:START}
+                Match = re.match("^\\s*#\\s+(!BSF)\\s+DEFT:{(.+?):\
+(START|END)}", DscLine)
+                if Match:
+                    if Match.group(3) == 'START' and not TemplateName:
+                        TemplateName = Match.group(2).strip()
+                        self._BsfTempDict[TemplateName] = []
+                    if Match.group(3) == 'END' and (
+                            TemplateName == Match.group(2).strip()
+                    ) and TemplateName:
+                        TemplateName = ''
+                else:
+                    if TemplateName:
+                        Match = re.match("^!include\\s*(.+)?$", DscLine)
+                        if Match:
+                            continue
+                        self._BsfTempDict[TemplateName].append(DscLine)
+
+            else:
+                Match = re.match("^\\s*#\\s+(!BSF|!HDR)\\s+(.+)", DscLine)
+                if Match:
+                    Remaining = Match.group(2)
+                    if Match.group(1) == '!BSF':
+                        Result = BsfRegExp.findall(Remaining)
+                        if Result:
+                            for Each in Result:
+                                Key = Each[0]
+                                Remaining = Each[1]
+
+                                if Key == 'BLOCK':
+                                    Match = re.match(
+                                        "NAME:\"(.+)\"\\s*,\\s*\
+VER:\"(.+)\"\\s*", Remaining)
+                                    if Match:
+                                        self._CfgBlkDict['name'] = \
+                                            Match.group(1)
+                                        self._CfgBlkDict['ver'] = \
+                                            Match.group(2)
+
+                                elif Key == 'SUBT':
+                                    # GPIO_TMPL:1:2:3
+                                    Remaining = Remaining.strip()
+                                    Match = re.match("(\\w+)\\s*:", Remaining)
+                                    if Match:
+                                        TemplateName = Match.group(1)
+                                        for Line in self._BsfTempDict[
+                                                TemplateName][::-1]:
+                                            NewLine = self.SubtituteLine(
+                                                Line, Remaining)
+                                            DscLines.insert(0, NewLine)
+                                            SkipLines += 1
+
+                                elif Key == 'PAGES':
+                                    # !BSF PAGES:{HSW:"Haswell System Agent", \
+                                    #             LPT:"Lynx Point PCH"}
+                                    PageList = Remaining.split(',')
+                                    for Page in PageList:
+                                        Page = Page.strip()
+                                        Match = re.match('(\\w+):\
+(\\w*:)?\\"(.+)\\"', Page)
+                                        if Match:
+                                            PageName = Match.group(1)
+                                            ParentName = Match.group(2)
+                                            if not ParentName or \
+                                                    ParentName == ':':
+                                                ParentName = 'root'
+                                            else:
+                                                ParentName = ParentName[:-1]
+                                            if not self.AddBsfChildPage(
+                                                    PageName, ParentName):
+                                                raise Exception("Cannot find \
+parent page '%s'!" % ParentName)
+                                            self._CfgPageDict[
+                                                PageName] = Match.group(3)
+                                        else:
+                                            raise Exception("Invalid page \
+definitions '%s'!" % Page)
+
+                                elif Key in ['NAME', 'HELP', 'OPTION'
+                                             ] and Remaining.startswith('+'):
+                                    # Allow certain options to be extended \
+                                    # to multiple lines
+                                    ConfigDict[Key.lower()] += Remaining[1:]
+
+                                else:
+                                    if Key == 'NAME':
+                                        Remaining = Remaining.strip()
+                                    elif Key == 'CONDITION':
+                                        Remaining = self.ExpandMacros(
+                                            Remaining.strip())
+                                    ConfigDict[Key.lower()] = Remaining
+                    else:
+                        Match = HdrRegExp.match(Remaining)
+                        if Match:
+                            Key = Match.group(1)
+                            Remaining = Match.group(2)
+                            if Key == 'EMBED':
+                                Parts = Remaining.split(':')
+                                Names = Parts[0].split(',')
+                                DummyDict = ConfigDict.copy()
+                                if len(Names) > 1:
+                                    Remaining = Names[0] + ':' + ':'.join(
+                                        Parts[1:])
+                                    DummyDict['struct'] = Names[1]
+                                else:
+                                    DummyDict['struct'] = Names[0]
+                                DummyDict['cname'] = 'Dummy'
+                                DummyDict['name'] = ''
+                                DummyDict['embed'] = Remaining
+                                DummyDict['offset'] = Offset
+                                DummyDict['length'] = 0
+                                DummyDict['value'] = '0'
+                                DummyDict['type'] = 'Reserved'
+                                DummyDict['help'] = ''
+                                DummyDict['subreg'] = []
+                                self._CfgItemList.append(DummyDict)
+                            else:
+                                ConfigDict[Key.lower()] = Remaining
+                # Check CFG line
+                # gCfgData.VariableName | * | 0x01 | 0x1
+                Clear = False
+
+                Match = TksRegExp.match(DscLine)
+                if Match:
+                    DscLine = 'gCfgData.%s' % Match.group(2)
+
+                if DscLine.startswith('gCfgData.'):
+                    Match = CfgRegExp.match(DscLine[9:])
+                else:
+                    Match = None
+                if Match:
+                    ConfigDict['space'] = 'gCfgData'
+                    ConfigDict['cname'] = Match.group(1)
+                    if Match.group(2) != '*':
+                        Offset = int(Match.group(2), 16)
+                    ConfigDict['offset'] = Offset
+                    ConfigDict['order'] = self.GetOrderNumber(
+                        ConfigDict['offset'], ConfigDict['order'])
+
+                    Value = Match.group(4).strip()
+                    if Match.group(3).startswith("0x"):
+                        Length = int(Match.group(3), 16)
+                    else:
+                        Length = int(Match.group(3))
+
+                    Offset += Length
+
+                    ConfigDict['length'] = Length
+                    Match = re.match("\\$\\((\\w+)\\)", Value)
+                    if Match:
+                        if Match.group(1) in self._MacroDict:
+                            Value = self._MacroDict[Match.group(1)]
+
+                    ConfigDict['value'] = Value
+                    if re.match("\\{\\s*FILE:(.+)\\}", Value):
+                        # Expand embedded binary file
+                        ValArray = self.ValueToByteArray(ConfigDict['value'],
+                                                         ConfigDict['length'])
+                        NewValue = Bytes2Str(ValArray)
+                        self._DscLines[-1] = re.sub(r'(.*)(\{\s*FILE:.+\})',
+                                                    r'\1 %s' % NewValue,
+                                                    self._DscLines[-1])
+                        ConfigDict['value'] = NewValue
+
+                    if ConfigDict['name'] == '':
+                        # Clear BSF specific items
+                        ConfigDict['bsfname'] = ''
+                        ConfigDict['help'] = ''
+                        ConfigDict['type'] = ''
+                        ConfigDict['option'] = ''
+
+                    self.CfgDuplicationCheck(CfgDict, ConfigDict['cname'])
+                    self._CfgItemList.append(ConfigDict.copy())
+                    Clear = True
+
+                else:
+                    # It could be a virtual item as below
+                    # !BSF FIELD:{SerialDebugPortAddress0:1}
+                    # or
+                    # @Bsf FIELD:{SerialDebugPortAddress0:1b}
+                    Match = re.match(r"^\s*#\s+(!BSF)\s+FIELD:{(.+)}", DscLine)
+                    if Match:
+                        BitFieldTxt = Match.group(2)
+                        Match = re.match("(.+):(\\d+)b([BWDQ])?", BitFieldTxt)
+                        if not Match:
+                            raise Exception("Incorrect bit field \
+format '%s' !" % BitFieldTxt)
+                        UnitBitLen = 1
+                        SubCfgDict = ConfigDict.copy()
+                        SubCfgDict['cname'] = Match.group(1)
+                        SubCfgDict['bitlength'] = int(
+                            Match.group(2)) * UnitBitLen
+                        if SubCfgDict['bitlength'] > 0:
+                            LastItem = self._CfgItemList[-1]
+                            if len(LastItem['subreg']) == 0:
+                                SubOffset = 0
+                            else:
+                                SubOffset = \
+                                    LastItem['subreg'][-1]['bitoffset'] \
+                                    + LastItem['subreg'][-1]['bitlength']
+                            if Match.group(3) == 'B':
+                                SubCfgDict['bitunit'] = 1
+                            elif Match.group(3) == 'W':
+                                SubCfgDict['bitunit'] = 2
+                            elif Match.group(3) == 'Q':
+                                SubCfgDict['bitunit'] = 8
+                            else:
+                                SubCfgDict['bitunit'] = 4
+                            SubCfgDict['bitoffset'] = SubOffset
+                            SubCfgDict['order'] = self.GetOrderNumber(
+                                SubCfgDict['offset'], SubCfgDict['order'],
+                                SubOffset)
+                            SubCfgDict['value'] = ''
+                            SubCfgDict['cname'] = '%s_%s' % (LastItem['cname'],
+                                                             Match.group(1))
+                            self.CfgDuplicationCheck(CfgDict,
+                                                     SubCfgDict['cname'])
+                            LastItem['subreg'].append(SubCfgDict.copy())
+                        Clear = True
+
+                if Clear:
+                    ConfigDict['name'] = ''
+                    ConfigDict['find'] = ''
+                    ConfigDict['struct'] = ''
+                    ConfigDict['embed'] = ''
+                    ConfigDict['marker'] = ''
+                    ConfigDict['comment'] = ''
+                    ConfigDict['order'] = -1
+                    ConfigDict['subreg'] = []
+                    ConfigDict['option'] = ''
+                    ConfigDict['condition'] = ''
+
+        return Error
+
+    def GetBsfBitFields(self, subitem, bytes):
+        start = subitem['bitoffset']
+        end = start + subitem['bitlength']
+        bitsvalue = ''.join('{0:08b}'.format(i) for i in bytes[::-1])
+        bitsvalue = bitsvalue[::-1]
+        bitslen = len(bitsvalue)
+        if start > bitslen or end > bitslen:
+            raise Exception("Invalid bits offset [%d,%d] %d for %s" %
+                            (start, end, bitslen, subitem['name']))
+        return '0x%X' % (int(bitsvalue[start:end][::-1], 2))
+
+    def UpdateBsfBitFields(self, SubItem, NewValue, ValueArray):
+        Start = SubItem['bitoffset']
+        End = Start + SubItem['bitlength']
+        Blen = len(ValueArray)
+        BitsValue = ''.join('{0:08b}'.format(i) for i in ValueArray[::-1])
+        BitsValue = BitsValue[::-1]
+        BitsLen = len(BitsValue)
+        if Start > BitsLen or End > BitsLen:
+            raise Exception("Invalid bits offset [%d,%d] %d for %s" %
+                            (Start, End, BitsLen, SubItem['name']))
+        BitsValue = BitsValue[:Start] + '{0:0{1}b}'.format(
+            NewValue, SubItem['bitlength'])[::-1] + BitsValue[End:]
+        ValueArray[:] = bytearray.fromhex(
+            '{0:0{1}x}'.format(int(BitsValue[::-1], 2), Blen * 2))[::-1]
+
+    def CreateVarDict(self):
+        Error = 0
+        self._VarDict = {}
+        if len(self._CfgItemList) > 0:
+            Item = self._CfgItemList[-1]
+            self._VarDict['_LENGTH_'] = '%d' % (Item['offset'] +
+                                                Item['length'])
+        for Item in self._CfgItemList:
+            Embed = Item['embed']
+            Match = re.match("^(\\w+):(\\w+):(START|END)", Embed)
+            if Match:
+                StructName = Match.group(1)
+                VarName = '_%s_%s_' % (Match.group(3), StructName)
+                if Match.group(3) == 'END':
+                    self._VarDict[VarName] = Item['offset'] + Item['length']
+                    self._VarDict['_LENGTH_%s_' % StructName] = \
+                        self._VarDict['_END_%s_' % StructName] - \
+                        self._VarDict['_START_%s_' % StructName]
+                    if Match.group(2).startswith('TAG_'):
+                        if (self.Mode != 'FSP') and (self._VarDict
+                                                     ['_LENGTH_%s_' %
+                                                      StructName] % 4):
+                            raise Exception("Size of structure '%s' is %d, \
+not DWORD aligned !" % (StructName, self._VarDict['_LENGTH_%s_' % StructName]))
+                        self._VarDict['_TAG_%s_' % StructName] = int(
+                            Match.group(2)[4:], 16) & 0xFFF
+                else:
+                    self._VarDict[VarName] = Item['offset']
+            if Item['marker']:
+                self._VarDict['_OFFSET_%s_' % Item['marker'].strip()] = \
+                    Item['offset']
+        return Error
+
+    def UpdateBsfBitUnit(self, Item):
+        BitTotal = 0
+        BitOffset = 0
+        StartIdx = 0
+        Unit = None
+        UnitDec = {1: 'BYTE', 2: 'WORD', 4: 'DWORD', 8: 'QWORD'}
+        for Idx, SubItem in enumerate(Item['subreg']):
+            if Unit is None:
+                Unit = SubItem['bitunit']
+            BitLength = SubItem['bitlength']
+            BitTotal += BitLength
+            BitOffset += BitLength
+
+            if BitOffset > 64 or BitOffset > Unit * 8:
+                break
+
+            if BitOffset == Unit * 8:
+                for SubIdx in range(StartIdx, Idx + 1):
+                    Item['subreg'][SubIdx]['bitunit'] = Unit
+                BitOffset = 0
+                StartIdx = Idx + 1
+                Unit = None
+
+        if BitOffset > 0:
+            raise Exception("Bit fields cannot fit into %s for \
+'%s.%s' !" % (UnitDec[Unit], Item['cname'], SubItem['cname']))
+
+        ExpectedTotal = Item['length'] * 8
+        if Item['length'] * 8 != BitTotal:
+            raise Exception("Bit fields total length (%d) does not match \
+length (%d) of '%s' !" % (BitTotal, ExpectedTotal, Item['cname']))
+
+    def UpdateDefaultValue(self):
+        Error = 0
+        for Idx, Item in enumerate(self._CfgItemList):
+            if len(Item['subreg']) == 0:
+                Value = Item['value']
+                if (len(Value) > 0) and (Value[0] == '{' or Value[0] == "'" or
+                                         Value[0] == '"'):
+                    # {XXX} or 'XXX' strings
+                    self.FormatListValue(self._CfgItemList[Idx])
+                else:
+                    Match = re.match("(0x[0-9a-fA-F]+|[0-9]+)", Value)
+                    if not Match:
+                        NumValue = self.EvaluateExpress(Value)
+                        Item['value'] = '0x%X' % NumValue
+            else:
+                ValArray = self.ValueToByteArray(Item['value'], Item['length'])
+                for SubItem in Item['subreg']:
+                    SubItem['value'] = self.GetBsfBitFields(SubItem, ValArray)
+                self.UpdateBsfBitUnit(Item)
+        return Error
+
+    @staticmethod
+    def ExpandIncludeFiles(FilePath, CurDir=''):
+        if CurDir == '':
+            CurDir = os.path.dirname(FilePath)
+            FilePath = os.path.basename(FilePath)
+
+        InputFilePath = os.path.join(CurDir, FilePath)
+        File = open(InputFilePath, "r")
+        Lines = File.readlines()
+        File.close()
+
+        NewLines = []
+        for LineNum, Line in enumerate(Lines):
+            Match = re.match("^!include\\s*(.+)?$", Line)
+            if Match:
+                IncPath = Match.group(1)
+                TmpPath = os.path.join(CurDir, IncPath)
+                OrgPath = TmpPath
+                if not os.path.exists(TmpPath):
+                    CurDir = os.path.join(os.path.dirname(
+                        os.path.realpath(__file__)), "..", "..")
+                    TmpPath = os.path.join(CurDir, IncPath)
+                if not os.path.exists(TmpPath):
+                    raise Exception("ERROR: Cannot open include file '%s'." %
+                                    OrgPath)
+                else:
+                    NewLines.append(('# Included from file: %s\n' %
+                                     IncPath, TmpPath, 0))
+                    NewLines.append(('# %s\n' % ('=' * 80), TmpPath, 0))
+                    NewLines.extend(CGenCfgData.ExpandIncludeFiles
+                                    (IncPath, CurDir))
+            else:
+                NewLines.append((Line, InputFilePath, LineNum))
+
+        return NewLines
+
+    def OverrideDefaultValue(self, DltFile):
+        Error = 0
+        DltLines = CGenCfgData.ExpandIncludeFiles(DltFile)
+
+        PlatformId = None
+        for Line, FilePath, LineNum in DltLines:
+            Line = Line.strip()
+            if not Line or Line.startswith('#'):
+                continue
+            Match = re.match("\\s*(\\w+)\\.(\\w+)(\\.\\w+)?\\s*\\|\\s*(.+)",
+                             Line)
+            if not Match:
+                raise Exception("Unrecognized line '%s' (File:'%s' Line:%d) !"
+                                % (Line, FilePath, LineNum + 1))
+
+            Found = False
+            InScope = False
+            for Idx, Item in enumerate(self._CfgItemList):
+                if not InScope:
+                    if not (Item['embed'].endswith(':START') and
+                            Item['embed'].startswith(Match.group(1))):
+                        continue
+                    InScope = True
+                if Item['cname'] == Match.group(2):
+                    Found = True
+                    break
+                if Item['embed'].endswith(':END') and \
+                        Item['embed'].startswith(Match.group(1)):
+                    break
+            Name = '%s.%s' % (Match.group(1), Match.group(2))
+            if not Found:
+                ErrItem = Match.group(2) if InScope else Match.group(1)
+                raise Exception("Invalid configuration '%s' in '%s' \
+(File:'%s' Line:%d) !" % (ErrItem, Name, FilePath, LineNum + 1))
+
+            ValueStr = Match.group(4).strip()
+            if Match.group(3) is not None:
+                # This is a subregion item
+                BitField = Match.group(3)[1:]
+                Found = False
+                if len(Item['subreg']) > 0:
+                    for SubItem in Item['subreg']:
+                        if SubItem['cname'] == '%s_%s' % \
+                                (Item['cname'], BitField):
+                            Found = True
+                            break
+                if not Found:
+                    raise Exception("Invalid configuration bit field \
+'%s' in '%s.%s' (File:'%s' Line:%d) !" % (BitField, Name, BitField,
+                                          FilePath, LineNum + 1))
+
+                try:
+                    Value = int(ValueStr, 16) if ValueStr.startswith('0x') \
+                        else int(ValueStr, 10)
+                except Exception:
+                    raise Exception("Invalid value '%s' for bit field '%s.%s' \
+(File:'%s' Line:%d) !" % (ValueStr, Name, BitField, FilePath, LineNum + 1))
+
+                if Value >= 2 ** SubItem['bitlength']:
+                    raise Exception("Invalid configuration bit field value \
+'%s' for '%s.%s' (File:'%s' Line:%d) !" % (Value, Name, BitField,
+                                           FilePath, LineNum + 1))
+
+                ValArray = self.ValueToByteArray(Item['value'], Item['length'])
+                self.UpdateBsfBitFields(SubItem, Value, ValArray)
+
+                if Item['value'].startswith('{'):
+                    Item['value'] = '{' + ', '.join('0x%02X' % i
+                                                    for i in ValArray) + '}'
+                else:
+                    BitsValue = ''.join('{0:08b}'.format(i)
+                                        for i in ValArray[::-1])
+                    Item['value'] = '0x%X' % (int(BitsValue, 2))
+            else:
+                if Item['value'].startswith('{') and \
+                        not ValueStr.startswith('{'):
+                    raise Exception("Data array required for '%s' \
+(File:'%s' Line:%d) !" % (Name, FilePath, LineNum + 1))
+                Item['value'] = ValueStr
+
+            if Name == 'PLATFORMID_CFG_DATA.PlatformId':
+                PlatformId = ValueStr
+
+        if (PlatformId is None) and (self.Mode != 'FSP'):
+            raise Exception("PLATFORMID_CFG_DATA.PlatformId is missing \
+in file '%s' !" % (DltFile))
+
+        return Error
+
+    def ProcessMultilines(self, String, MaxCharLength):
+        Multilines = ''
+        StringLength = len(String)
+        CurrentStringStart = 0
+        StringOffset = 0
+        BreakLineDict = []
+        if len(String) <= MaxCharLength:
+            while (StringOffset < StringLength):
+                if StringOffset >= 1:
+                    if String[StringOffset - 1] == '\\' and \
+                            String[StringOffset] == 'n':
+                        BreakLineDict.append(StringOffset + 1)
+                StringOffset += 1
+            if BreakLineDict != []:
+                for Each in BreakLineDict:
+                    Multilines += " %s\n" % String[CurrentStringStart:Each].\
+                        lstrip()
+                    CurrentStringStart = Each
+                if StringLength - CurrentStringStart > 0:
+                    Multilines += " %s\n" % String[CurrentStringStart:].\
+                        lstrip()
+            else:
+                Multilines = " %s\n" % String
+        else:
+            NewLineStart = 0
+            NewLineCount = 0
+            FoundSpaceChar = False
+            while(StringOffset < StringLength):
+                if StringOffset >= 1:
+                    if NewLineCount >= MaxCharLength - 1:
+                        if String[StringOffset] == ' ' and \
+                                StringLength - StringOffset > 10:
+                            BreakLineDict.append(NewLineStart + NewLineCount)
+                            NewLineStart = NewLineStart + NewLineCount
+                            NewLineCount = 0
+                            FoundSpaceChar = True
+                        elif StringOffset == StringLength - 1 \
+                                and FoundSpaceChar is False:
+                            BreakLineDict.append(0)
+                    if String[StringOffset - 1] == '\\' and \
+                            String[StringOffset] == 'n':
+                        BreakLineDict.append(StringOffset + 1)
+                        NewLineStart = StringOffset + 1
+                        NewLineCount = 0
+                StringOffset += 1
+                NewLineCount += 1
+            if BreakLineDict != []:
+                BreakLineDict.sort()
+                for Each in BreakLineDict:
+                    if Each > 0:
+                        Multilines += " %s\n" % String[
+                            CurrentStringStart:Each].lstrip()
+                    CurrentStringStart = Each
+                if StringLength - CurrentStringStart > 0:
+                    Multilines += " %s\n" % String[CurrentStringStart:].\
+                        lstrip()
+        return Multilines
+
+    def CreateField(self, Item, Name, Length, Offset, Struct,
+                    BsfName, Help, Option, BitsLength=None):
+        PosName = 28
+        NameLine = ''
+        HelpLine = ''
+        OptionLine = ''
+
+        if Length == 0 and Name == 'Dummy':
+            return '\n'
+
+        IsArray = False
+        if Length in [1, 2, 4, 8]:
+            Type = "UINT%d" % (Length * 8)
+        else:
+            IsArray = True
+            Type = "UINT8"
+
+        if Item and Item['value'].startswith('{'):
+            Type = "UINT8"
+            IsArray = True
+
+        if Struct != '':
+            Type = Struct
+            if Struct in ['UINT8', 'UINT16', 'UINT32', 'UINT64']:
+                IsArray = True
+                Unit = int(Type[4:]) // 8
+                Length = Length / Unit
+            else:
+                IsArray = False
+
+        if IsArray:
+            Name = Name + '[%d]' % Length
+
+        if len(Type) < PosName:
+            Space1 = PosName - len(Type)
+        else:
+            Space1 = 1
+
+        if BsfName != '':
+            NameLine = " %s\n" % BsfName
+        else:
+            NameLine = "\n"
+
+        if Help != '':
+            HelpLine = self.ProcessMultilines(Help, 80)
+
+        if Option != '':
+            OptionLine = self.ProcessMultilines(Option, 80)
+
+        if BitsLength is None:
+            BitsLength = ''
+        else:
+            BitsLength = ' : %d' % BitsLength
+
+        return "\n/** %s%s%s**/\n %s%s%s%s;\n" % \
+               (NameLine, HelpLine, OptionLine, Type, ' ' * Space1, Name,
+                BitsLength)
+
+    def SplitTextBody(self, TextBody):
+        Marker1 = '{ /* _COMMON_STRUCT_START_ */'
+        Marker2 = '; /* _COMMON_STRUCT_END_ */'
+        ComBody = []
+        TxtBody = []
+        IsCommon = False
+        for Line in TextBody:
+            if Line.strip().endswith(Marker1):
+                Line = Line.replace(Marker1[1:], '')
+                IsCommon = True
+            if Line.strip().endswith(Marker2):
+                Line = Line.replace(Marker2[1:], '')
+                if IsCommon:
+                    ComBody.append(Line)
+                IsCommon = False
+                continue
+            if IsCommon:
+                ComBody.append(Line)
+            else:
+                TxtBody.append(Line)
+        return ComBody, TxtBody
+
+    def GetStructArrayInfo(self, Input):
+        ArrayStr = Input.split('[')
+        Name = ArrayStr[0]
+        if len(ArrayStr) > 1:
+            NumStr = ''.join(c for c in ArrayStr[-1] if c.isdigit())
+            NumStr = '1000' if len(NumStr) == 0 else NumStr
+            ArrayNum = int(NumStr)
+        else:
+            ArrayNum = 0
+        return Name, ArrayNum
+
+    def PostProcessBody(self, TextBody, IncludeEmbedOnly=True):
+        NewTextBody = []
+        OldTextBody = []
+        IncTextBody = []
+        StructBody = []
+        IncludeLine = False
+        EmbedFound = False
+        StructName = ''
+        ArrayVarName = ''
+        VariableName = ''
+        Count = 0
+        Level = 0
+        IsCommonStruct = False
+
+        for Line in TextBody:
+            if Line.startswith('#define '):
+                IncTextBody.append(Line)
+                continue
+
+            if not Line.startswith('/* EMBED_STRUCT:'):
+                Match = False
+            else:
+                Match = re.match("^/\\*\\sEMBED_STRUCT:([\\w\\[\\]\\*]+):\
+([\\w\\[\\]\\*]+):(\\w+):(START|END)([\\s\\d]+)\\*/([\\s\\S]*)", Line)
+
+            if Match:
+                ArrayMarker = Match.group(5)
+                if Match.group(4) == 'END':
+                    Level -= 1
+                    if Level == 0:
+                        Line = Match.group(6)
+                else:  # 'START'
+                    Level += 1
+                    if Level == 1:
+                        Line = Match.group(6)
+                    else:
+                        EmbedFound = True
+                    TagStr = Match.group(3)
+                    if TagStr.startswith('TAG_'):
+                        try:
+                            TagVal = int(TagStr[4:], 16)
+                        except Exception:
+                            TagVal = -1
+                        if (TagVal >= 0) and (TagVal < self._MinCfgTagId):
+                            IsCommonStruct = True
+
+                    if Level == 1:
+                        if IsCommonStruct:
+                            Suffix = ' /* _COMMON_STRUCT_START_ */'
+                        else:
+                            Suffix = ''
+                        StructBody = ['typedef struct {%s' % Suffix]
+                        StructName = Match.group(1)
+                        StructType = Match.group(2)
+                        VariableName = Match.group(3)
+                        MatchOffset = re.search('/\\*\\*\\sOffset\\s0x\
+([a-fA-F0-9]+)', Line)
+                        if MatchOffset:
+                            Offset = int(MatchOffset.group(1), 16)
+                        else:
+                            Offset = None
+                        IncludeLine = True
+
+                        ModifiedStructType = StructType.rstrip()
+                        if ModifiedStructType.endswith(']'):
+                            Idx = ModifiedStructType.index('[')
+                            if ArrayMarker != ' ':
+                                # Auto array size
+                                OldTextBody.append('')
+                                ArrayVarName = VariableName
+                                if int(ArrayMarker) == 1000:
+                                    Count = 1
+                                else:
+                                    Count = int(ArrayMarker) + 1000
+                            else:
+                                if Count < 1000:
+                                    Count += 1
+
+                            VariableTemp = ArrayVarName + '[%d]' % (
+                                Count if Count < 1000 else Count - 1000)
+                            OldTextBody[-1] = self.CreateField(
+                                None, VariableTemp, 0, Offset,
+                                ModifiedStructType[:Idx], '',
+                                'Structure Array', '')
+                        else:
+                            ArrayVarName = ''
+                            OldTextBody.append(self.CreateField(
+                                None, VariableName, 0, Offset,
+                                ModifiedStructType, '', '', ''))
+
+            if IncludeLine:
+                StructBody.append(Line)
+            else:
+                OldTextBody.append(Line)
+
+            if Match and Match.group(4) == 'END':
+                if Level == 0:
+                    if (StructType != Match.group(2)) or \
+                            (VariableName != Match.group(3)):
+                        print("Unmatched struct name '%s' and '%s' !" %
+                              (StructName, Match.group(2)))
+                    else:
+                        if IsCommonStruct:
+                            Suffix = ' /* _COMMON_STRUCT_END_ */'
+                        else:
+                            Suffix = ''
+                        Line = '} %s;%s\n\n\n' % (StructName, Suffix)
+                        StructBody.append(Line)
+                        if (Line not in NewTextBody) and \
+                                (Line not in OldTextBody):
+                            NewTextBody.extend(StructBody)
+                    IncludeLine = False
+                    IsCommonStruct = False
+
+        if not IncludeEmbedOnly:
+            NewTextBody.extend(OldTextBody)
+
+        if EmbedFound:
+            NewTextBody = self.PostProcessBody(NewTextBody, False)
+
+        NewTextBody = IncTextBody + NewTextBody
+        return NewTextBody
+
+    def WriteHeaderFile(self, TxtBody, FileName, Type='h'):
+        FileNameDef = os.path.basename(FileName).replace('.', '_')
+        FileNameDef = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', FileNameDef)
+        FileNameDef = re.sub('([a-z0-9])([A-Z])', r'\1_\2',
+                             FileNameDef).upper()
+
+        Lines = []
+        Lines.append("%s\n" % GetCopyrightHeader(Type))
+        Lines.append("#ifndef __%s__\n" % FileNameDef)
+        Lines.append("#define __%s__\n\n" % FileNameDef)
+        if Type == 'h':
+            Lines.append("#pragma pack(1)\n\n")
+        Lines.extend(TxtBody)
+        if Type == 'h':
+            Lines.append("#pragma pack()\n\n")
+        Lines.append("#endif\n")
+
+        # Don't rewrite if the contents are the same
+        Create = True
+        if os.path.exists(FileName):
+            HdrFile = open(FileName, "r")
+            OrgTxt = HdrFile.read()
+            HdrFile.close()
+
+            NewTxt = ''.join(Lines)
+            if OrgTxt == NewTxt:
+                Create = False
+
+        if Create:
+            HdrFile = open(FileName, "w")
+            HdrFile.write(''.join(Lines))
+            HdrFile.close()
+
+    def CreateHeaderFile(self, HdrFileName, ComHdrFileName=''):
+        LastStruct = ''
+        SpaceIdx = 0
+        Offset = 0
+        FieldIdx = 0
+        LastFieldIdx = 0
+        ResvOffset = 0
+        ResvIdx = 0
+        TxtBody = []
+        LineBuffer = []
+        CfgTags = []
+        LastVisible = True
+
+        TxtBody.append("typedef struct {\n")
+        for Item in self._CfgItemList:
+            # Search for CFGDATA tags
+            Embed = Item["embed"].upper()
+            if Embed.endswith(':START'):
+                Match = re.match(r'(\w+)_CFG_DATA:TAG_([0-9A-F]+):START',
+                                 Embed)
+                if Match:
+                    TagName = Match.group(1)
+                    TagId = int(Match.group(2), 16)
+                    CfgTags.append((TagId, TagName))
+
+            # Only process visible items
+            NextVisible = LastVisible
+
+            if LastVisible and (Item['header'] == 'OFF'):
+                NextVisible = False
+                ResvOffset = Item['offset']
+            elif (not LastVisible) and Item['header'] == 'ON':
+                NextVisible = True
+                Name = "ReservedUpdSpace%d" % ResvIdx
+                ResvIdx = ResvIdx + 1
+                TxtBody.append(self.CreateField(
+                    Item, Name, Item["offset"] - ResvOffset,
+                    ResvOffset, '', '', '', ''))
+                FieldIdx += 1
+
+            if Offset < Item["offset"]:
+                if LastVisible:
+                    Name = "UnusedUpdSpace%d" % SpaceIdx
+                    LineBuffer.append(self.CreateField
+                                      (Item, Name, Item["offset"] -
+                                       Offset, Offset, '', '', '', ''))
+                    FieldIdx += 1
+                SpaceIdx = SpaceIdx + 1
+                Offset = Item["offset"]
+
+            LastVisible = NextVisible
+
+            Offset = Offset + Item["length"]
+            if LastVisible:
+                for Each in LineBuffer:
+                    TxtBody.append(Each)
+                LineBuffer = []
+                Embed = Item["embed"].upper()
+                if Embed.endswith(':START') or Embed.endswith(':END'):
+                    # EMBED_STRUCT: StructName : \
+                    # ItemName : VariableName : START|END
+                    Name, ArrayNum = self.GetStructArrayInfo(Item["struct"])
+                    Remaining = Item["embed"]
+                    if (LastFieldIdx + 1 == FieldIdx) and (LastStruct == Name):
+                        ArrayMarker = ' '
+                    else:
+                        ArrayMarker = '%d' % ArrayNum
+                    LastFieldIdx = FieldIdx
+                    LastStruct = Name
+                    Marker = '/* EMBED_STRUCT:%s:%s%s*/ ' % (Name, Remaining,
+                                                             ArrayMarker)
+                    # if Embed.endswith(':START') and Comment != '':
+                    #     Marker = '/* COMMENT:%s */ \n' % Item["comment"] + Marker
+                else:
+                    if Embed == '':
+                        Marker = ''
+                    else:
+                        self.Error = "Invalid embedded structure \
+format '%s'!\n" % Item["embed"]
+                        return 4
+
+                # Generate bit fields for structure
+                if len(Item['subreg']) > 0 and Item["struct"]:
+                    StructType = Item["struct"]
+                    StructName, ArrayNum = self.GetStructArrayInfo(StructType)
+                    if (LastFieldIdx + 1 == FieldIdx) and \
+                            (LastStruct == Item["struct"]):
+                        ArrayMarker = ' '
+                    else:
+                        ArrayMarker = '%d' % ArrayNum
+                    TxtBody.append('/* EMBED_STRUCT:%s:%s:%s:START%s*/\n' %
+                                   (StructName, StructType, Item["cname"],
+                                    ArrayMarker))
+                    for SubItem in Item['subreg']:
+                        Name = SubItem["cname"]
+                        if Name.startswith(Item["cname"]):
+                            Name = Name[len(Item["cname"]) + 1:]
+                        Line = self.CreateField(
+                            SubItem, Name, SubItem["bitunit"],
+                            SubItem["offset"], SubItem['struct'],
+                            SubItem['name'], SubItem['help'],
+                            SubItem['option'], SubItem['bitlength'])
+                        TxtBody.append(Line)
+                    TxtBody.append('/* EMBED_STRUCT:%s:%s:%s:END%s*/\n' %
+                                   (StructName, StructType, Item["cname"],
+                                    ArrayMarker))
+                    LastFieldIdx = FieldIdx
+                    LastStruct = Item["struct"]
+                    FieldIdx += 1
+                else:
+                    FieldIdx += 1
+                    Line = Marker + self.CreateField(
+                        Item, Item["cname"], Item["length"], Item["offset"],
+                        Item['struct'], Item['name'], Item['help'],
+                        Item['option'])
+                    TxtBody.append(Line)
+
+        TxtBody.append("}\n\n")
+
+        # Handle the embedded data structure
+        TxtBody = self.PostProcessBody(TxtBody)
+        ComBody, TxtBody = self.SplitTextBody(TxtBody)
+
+        # Prepare TAG defines
+        PltTagDefTxt = ['\n']
+        ComTagDefTxt = ['\n']
+        for TagId, TagName in sorted(CfgTags):
+            TagLine = '#define %-30s 0x%03X\n' % ('CDATA_%s_TAG' %
+                                                  TagName, TagId)
+            if TagId < self._MinCfgTagId:
+                # TAG ID < 0x100, it is a generic TAG
+                ComTagDefTxt.append(TagLine)
+            else:
+                PltTagDefTxt.append(TagLine)
+        PltTagDefTxt.append('\n\n')
+        ComTagDefTxt.append('\n\n')
+
+        # Write file back
+        self.WriteHeaderFile(PltTagDefTxt + TxtBody, HdrFileName)
+        if ComHdrFileName:
+            self.WriteHeaderFile(ComTagDefTxt + ComBody, ComHdrFileName)
+
+        return 0
+
+    def UpdateConfigItemValue(self, Item, ValueStr):
+        IsArray = True if Item['value'].startswith('{') else False
+        IsString = True if Item['value'].startswith("'") else False
+        Bytes = self.ValueToByteArray(ValueStr, Item['length'])
+        if IsString:
+            NewValue = "'%s'" % Bytes.decode("utf-8")
+        elif IsArray:
+            NewValue = Bytes2Str(Bytes)
+        else:
+            Fmt = '0x%X' if Item['value'].startswith('0x') else '%d'
+            NewValue = Fmt % Bytes2Val(Bytes)
+        Item['value'] = NewValue
+
+    def LoadDefaultFromBinaryArray(self, BinDat, IgnoreFind=False):
+        FindOff = 0
+        StartOff = 0
+        for Item in self._CfgItemList:
+            if Item['length'] == 0:
+                continue
+            if not IgnoreFind and Item['find']:
+                FindBin = Item['find'].encode()
+                Offset = BinDat.find(FindBin)
+                if Offset >= 0:
+                    TestOff = BinDat[Offset+len(FindBin):].find(FindBin)
+                    if TestOff >= 0:
+                        raise Exception('Multiple match found for "%s" !' %
+                                        Item['find'])
+                    FindOff = Offset + len(FindBin)
+                    StartOff = Item['offset']
+                else:
+                    raise Exception('Could not find "%s" !' % Item['find'])
+            if Item['offset'] + Item['length'] > len(BinDat):
+                raise Exception('Mismatching format between DSC \
+and BIN files !')
+            Offset = FindOff + (Item['offset'] - StartOff)
+            ValStr = Bytes2Str(BinDat[Offset: Offset + Item['length']])
+            self.UpdateConfigItemValue(Item, ValStr)
+
+        self.UpdateDefaultValue()
+
+    def PatchBinaryArray(self, BinDat):
+        FileOff = 0
+        Offset = 0
+        FindOff = 0
+
+        PatchList = []
+        CfgBin = bytearray()
+        for Item in self._CfgItemList:
+            if Item['length'] == 0:
+                continue
+
+            if Item['find']:
+                if len(CfgBin) > 0:
+                    PatchList.append((FileOff, CfgBin))
+                FindBin = Item['find'].encode()
+                FileOff = BinDat.find(FindBin)
+                if FileOff < 0:
+                    raise Exception('Could not find "%s" !' % Item['find'])
+                else:
+                    TestOff = BinDat[FileOff+len(FindBin):].find(FindBin)
+                    if TestOff >= 0:
+                        raise Exception('Multiple match found for "%s" !' %
+                                        Item['find'])
+                FileOff += len(FindBin)
+                Offset = Item['offset']
+                FindOff = Offset
+                CfgBin = bytearray()
+
+            if Item['offset'] > Offset:
+                Gap = Item['offset'] - Offset
+                CfgBin.extend(b'\x00' * Gap)
+
+            if Item['type'] == 'Reserved' and Item['option'] == '$SKIP':
+                # keep old data
+                NewOff = FileOff + (Offset - FindOff)
+                FileData = bytearray(BinDat[NewOff: NewOff + Item['length']])
+                CfgBin.extend(FileData)
+            else:
+                CfgBin.extend(self.ValueToByteArray(Item['value'],
+                                                    Item['length']))
+            Offset = Item['offset'] + Item['length']
+
+        if len(CfgBin) > 0:
+            PatchList.append((FileOff, CfgBin))
+
+        for FileOff, CfgBin in PatchList:
+            Length = len(CfgBin)
+            if FileOff + Length < len(BinDat):
+                BinDat[FileOff:FileOff+Length] = CfgBin[:]
+
+        return BinDat
+
+    def GenerateBinaryArray(self):
+        Offset = 0
+        BinDat = bytearray()
+        for Item in self._CfgItemList:
+            if Item['offset'] > Offset:
+                Gap = Item['offset'] - Offset
+                BinDat.extend(b'\x00' * Gap)
+            BinDat.extend(self.ValueToByteArray(Item['value'], Item['length']))
+            Offset = Item['offset'] + Item['length']
+        return BinDat
+
+    def GenerateBinary(self, BinFileName):
+        BinFile = open(BinFileName, "wb")
+        BinFile.write(self.GenerateBinaryArray())
+        BinFile.close()
+        return 0
+
+    def GenerateDataIncFile(self, DatIncFileName, BinFile=None):
+        # Put a prefix GUID before CFGDATA so that it can be located later on
+        Prefix = b'\xa7\xbd\x7f\x73\x20\x1e\x46\xd6\xbe\x8f\
+\x64\x12\x05\x8d\x0a\xa8'
+        if BinFile:
+            Fin = open(BinFile, 'rb')
+            BinDat = Prefix + bytearray(Fin.read())
+            Fin.close()
+        else:
+            BinDat = Prefix + self.GenerateBinaryArray()
+
+        FileName = os.path.basename(DatIncFileName).upper()
+        FileName = FileName.replace('.', '_')
+
+        TxtLines = []
+
+        TxtLines.append("UINT8 mConfigDataBlob[%d] = {\n" % len(BinDat))
+        Count = 0
+        Line = [' ']
+        for Each in BinDat:
+            Line.append('0x%02X, ' % Each)
+            Count = Count + 1
+            if (Count & 0x0F) == 0:
+                Line.append('\n')
+                TxtLines.append(''.join(Line))
+                Line = [' ']
+        if len(Line) > 1:
+            TxtLines.append(''.join(Line) + '\n')
+
+        TxtLines.append("};\n\n")
+
+        self.WriteHeaderFile(TxtLines, DatIncFileName, 'inc')
+
+        return 0
+
+    def CheckCfgData(self):
+        # Check if CfgData contains any duplicated name
+        def AddItem(Item, ChkList):
+            Name = Item['cname']
+            if Name in ChkList:
+                return Item
+            if Name not in ['Dummy', 'Reserved', 'CfgHeader', 'CondValue']:
+                ChkList.append(Name)
+            return None
+
+        Duplicate = None
+        ChkList = []
+        for Item in self._CfgItemList:
+            Duplicate = AddItem(Item, ChkList)
+            if not Duplicate:
+                for SubItem in Item['subreg']:
+                    Duplicate = AddItem(SubItem, ChkList)
+                    if Duplicate:
+                        break
+            if Duplicate:
+                break
+        if Duplicate:
+            self.Error = "Duplicated CFGDATA '%s' found !\n" % \
+                Duplicate['cname']
+            return -1
+        return 0
+
+    def PrintData(self):
+        for Item in self._CfgItemList:
+            if not Item['length']:
+                continue
+            print("%-10s @Offset:0x%04X Len:%3d Val:%s" %
+                  (Item['cname'], Item['offset'], Item['length'],
+                   Item['value']))
+            for SubItem in Item['subreg']:
+                print(" %-20s BitOff:0x%04X BitLen:%-3d Val:%s" %
+                      (SubItem['cname'], SubItem['bitoffset'],
+                       SubItem['bitlength'], SubItem['value']))
+
+    def FormatArrayValue(self, Input, Length):
+        Dat = self.ValueToByteArray(Input, Length)
+        return ','.join('0x%02X' % Each for Each in Dat)
+
+    def GetItemOptionList(self, Item):
+        TmpList = []
+        if Item['type'] == "Combo":
+            if not Item['option'] in self._BuidinOption:
+                OptList = Item['option'].split(',')
+                for Option in OptList:
+                    Option = Option.strip()
+                    try:
+                        (OpVal, OpStr) = Option.split(':')
+                    except Exception:
+                        raise Exception("Invalid option format '%s' !" %
+                                        Option)
+                    TmpList.append((OpVal, OpStr))
+        return TmpList
+
+    def WriteBsfStruct(self, BsfFd, Item):
+        if Item['type'] == "None":
+            Space = "gPlatformFspPkgTokenSpaceGuid"
+        else:
+            Space = Item['space']
+        Line = " $%s_%s" % (Space, Item['cname'])
+        Match = re.match("\\s*(\\{.+\\})\\s*", Item['value'])
+        if Match:
+            DefaultValue = self.FormatArrayValue(Match.group(1).strip(),
+                                                 Item['length'])
+        else:
+            DefaultValue = Item['value'].strip()
+        if 'bitlength' in Item:
+            if Item['bitlength']:
+                BsfFd.write(" %s%s%4d bits $_DEFAULT_ = %s\n" %
+                            (Line, ' ' * (64 - len(Line)), Item['bitlength'],
+                             DefaultValue))
+        else:
+            if Item['length']:
+                BsfFd.write(" %s%s%4d bytes $_DEFAULT_ = %s\n" %
+                            (Line, ' ' * (64 - len(Line)), Item['length'],
+                             DefaultValue))
+
+        return self.GetItemOptionList(Item)
+
+    def GetBsfOption(self, OptionName):
+        if OptionName in self._CfgOptsDict:
+            return self._CfgOptsDict[OptionName]
+        else:
+            return OptionName
+
+    def WriteBsfOption(self, BsfFd, Item):
+        PcdName = Item['space'] + '_' + Item['cname']
+        WriteHelp = 0
+        BsfLines = []
+        if Item['type'] == "Combo":
+            if Item['option'] in self._BuidinOption:
+                Options = self._BuidinOption[Item['option']]
+            else:
+                Options = self.GetBsfOption(PcdName)
+            BsfLines.append(' %s $%s, "%s", &%s,\n' % (
+                Item['type'], PcdName, Item['name'], Options))
+            WriteHelp = 1
+        elif Item['type'].startswith("EditNum"):
+            Match = re.match("EditNum\\s*,\\s*(HEX|DEC)\\s*,\\s*\\(\
+(\\d+|0x[0-9A-Fa-f]+)\\s*,\\s*(\\d+|0x[0-9A-Fa-f]+)\\)", Item['type'])
+            if Match:
+                BsfLines.append(' EditNum $%s, "%s", %s,\n' % (
+                    PcdName, Item['name'], Match.group(1)))
+                WriteHelp = 2
+        elif Item['type'].startswith("EditText"):
+            BsfLines.append(' %s $%s, "%s",\n' % (Item['type'], PcdName,
+                                                  Item['name']))
+            WriteHelp = 1
+        elif Item['type'] == "Table":
+            Columns = Item['option'].split(',')
+            if len(Columns) != 0:
+                BsfLines.append(' %s $%s "%s",' % (Item['type'], PcdName,
+                                                   Item['name']))
+                for Col in Columns:
+                    Fmt = Col.split(':')
+                    if len(Fmt) != 3:
+                        raise Exception("Column format '%s' is invalid !" %
+                                        Fmt)
+                    try:
+                        Dtype = int(Fmt[1].strip())
+                    except Exception:
+                        raise Exception("Column size '%s' is invalid !" %
+                                        Fmt[1])
+                    BsfLines.append('\n Column "%s", %d bytes, %s' %
+                                    (Fmt[0].strip(), Dtype, Fmt[2].strip()))
+                BsfLines.append(',\n')
+                WriteHelp = 1
+
+        if WriteHelp > 0:
+            HelpLines = Item['help'].split('\\n\\r')
+            FirstLine = True
+            for HelpLine in HelpLines:
+                if FirstLine:
+                    FirstLine = False
+                    BsfLines.append(' Help "%s"\n' % (HelpLine))
+                else:
+                    BsfLines.append(' "%s"\n' % (HelpLine))
+            if WriteHelp == 2:
+                BsfLines.append(' "Valid range: %s ~ %s"\n' %
+                                (Match.group(2), Match.group(3)))
+
+        if len(Item['condition']) > 4:
+            CondList = Item['condition'].split(',')
+            Idx = 0
+            for Cond in CondList:
+                Cond = Cond.strip()
+                if Cond.startswith('#'):
+                    BsfLines.insert(Idx, Cond + '\n')
+                    Idx += 1
+                elif Cond.startswith('@#'):
+                    BsfLines.append(Cond[1:] + '\n')
+
+        for Line in BsfLines:
+            BsfFd.write(Line)
+
+    def WriteBsfPages(self, PageTree, BsfFd):
+        BsfFd.write('\n')
+        Key = next(iter(PageTree))
+        for Page in PageTree[Key]:
+            PageName = next(iter(Page))
+            BsfFd.write('Page "%s"\n' % self._CfgPageDict[PageName])
+            if len(PageTree[Key]):
+                self.WriteBsfPages(Page, BsfFd)
+
+            BsfItems = []
+            for Item in self._CfgItemList:
+                if Item['name'] != '':
+                    if Item['page'] != PageName:
+                        continue
+                    if len(Item['subreg']) > 0:
+                        for SubItem in Item['subreg']:
+                            if SubItem['name'] != '':
+                                BsfItems.append(SubItem)
+                    else:
+                        BsfItems.append(Item)
+
+            BsfItems.sort(key=lambda x: x['order'])
+
+            for Item in BsfItems:
+                self.WriteBsfOption(BsfFd, Item)
+            BsfFd.write("EndPage\n\n")
+
+    def GenerateBsfFile(self, BsfFile):
+
+        if BsfFile == '':
+            self.Error = "BSF output file '%s' is invalid" % BsfFile
+            return 1
+
+        Error = 0
+        OptionDict = {}
+        BsfFd = open(BsfFile, "w")
+        BsfFd.write("%s\n" % GetCopyrightHeader('bsf'))
+        BsfFd.write("%s\n" % self._GlobalDataDef)
+        BsfFd.write("StructDef\n")
+        NextOffset = -1
+        for Item in self._CfgItemList:
+            if Item['find'] != '':
+                BsfFd.write('\n    Find "%s"\n' % Item['find'])
+                NextOffset = Item['offset'] + Item['length']
+            if Item['name'] != '':
+                if NextOffset != Item['offset']:
+                    BsfFd.write("    Skip %d bytes\n" %
+                                (Item['offset'] - NextOffset))
+                if len(Item['subreg']) > 0:
+                    NextOffset = Item['offset']
+                    BitsOffset = NextOffset * 8
+                    for SubItem in Item['subreg']:
+                        BitsOffset += SubItem['bitlength']
+                        if SubItem['name'] == '':
+                            if 'bitlength' in SubItem:
+                                BsfFd.write("    Skip %d bits\n" %
+                                            (SubItem['bitlength']))
+                            else:
+                                BsfFd.write("    Skip %d bytes\n" %
+                                            (SubItem['length']))
+                        else:
+                            Options = self.WriteBsfStruct(BsfFd, SubItem)
+                            if len(Options) > 0:
+                                OptionDict[SubItem
+                                           ['space']+'_'+SubItem
+                                           ['cname']] = Options
+
+                    NextBitsOffset = (Item['offset'] + Item['length']) * 8
+                    if NextBitsOffset > BitsOffset:
+                        BitsGap = NextBitsOffset - BitsOffset
+                        BitsRemain = BitsGap % 8
+                        if BitsRemain:
+                            BsfFd.write("    Skip %d bits\n" % BitsRemain)
+                            BitsGap -= BitsRemain
+                        BytesRemain = BitsGap // 8
+                        if BytesRemain:
+                            BsfFd.write("    Skip %d bytes\n" %
+                                        BytesRemain)
+                    NextOffset = Item['offset'] + Item['length']
+                else:
+                    NextOffset = Item['offset'] + Item['length']
+                    Options = self.WriteBsfStruct(BsfFd, Item)
+                    if len(Options) > 0:
+                        OptionDict[Item['space']+'_'+Item['cname']] = Options
+        BsfFd.write("\nEndStruct\n\n")
+
+        BsfFd.write("%s" % self._BuidinOptionTxt)
+
+        NameList = []
+        OptionList = []
+        for Each in sorted(OptionDict):
+            if OptionDict[Each] not in OptionList:
+                NameList.append(Each)
+                OptionList.append(OptionDict[Each])
+                BsfFd.write("List &%s\n" % Each)
+                for Item in OptionDict[Each]:
+                    BsfFd.write('    Selection %s , "%s"\n' %
+                                (self.EvaluateExpress(Item[0]), Item[1]))
+                BsfFd.write("EndList\n\n")
+            else:
+                # Item has identical options as another item;
+                # try to reuse the previous options instead
+                Idx = OptionList.index(OptionDict[Each])
+                self._CfgOptsDict[Each] = NameList[Idx]
+
+        BsfFd.write("BeginInfoBlock\n")
+        BsfFd.write('    PPVer       "%s"\n' % (self._CfgBlkDict['ver']))
+        BsfFd.write('    Description "%s"\n' % (self._CfgBlkDict['name']))
+        BsfFd.write("EndInfoBlock\n\n")
+
+        self.WriteBsfPages(self._CfgPageTree, BsfFd)
+
+        BsfFd.close()
+        return Error
+
+    def WriteDeltaLine(self, OutLines, Name, ValStr, IsArray):
+        if IsArray:
+            Output = '%s | { %s }' % (Name, ValStr)
+        else:
+            Output = '%s | 0x%X' % (Name, Array2Val(ValStr))
+        OutLines.append(Output)
+
+    def WriteDeltaFile(self, OutFile, PlatformId, OutLines):
+        DltFd = open(OutFile, "w")
+        DltFd.write("%s\n" % GetCopyrightHeader('dlt', True))
+        if PlatformId is not None:
+            DltFd.write('#\n')
+            DltFd.write('# Delta configuration values \
+for platform ID 0x%04X\n' % PlatformId)
+            DltFd.write('#\n\n')
+        for Line in OutLines:
+            DltFd.write('%s\n' % Line)
+        DltFd.close()
+
+    def GenerateDeltaFile(self, OutFile, AbsfFile):
+        # Parse ABSF as-built values into a dict
+        if not os.path.exists(AbsfFile):
+            Lines = []
+        else:
+            with open(AbsfFile) as Fin:
+                Lines = Fin.readlines()
+
+        AbsfBuiltValDict = {}
+        Process = False
+        for Line in Lines:
+            Line = Line.strip()
+            if Line.startswith('StructDef'):
+                Process = True
+            if Line.startswith('EndStruct'):
+                break
+            if not Process:
+                continue
+            Match = re.match('\\s*\\$gCfgData_(\\w+)\\s+\
+(\\d+)\\s+(bits|bytes)\\s+\\$_AS_BUILT_\\s+=\\s+(.+)\\$', Line)
+            if Match:
+                if Match.group(1) not in AbsfBuiltValDict:
+                    AbsfBuiltValDict[Match.group(1)] = Match.group(4).strip()
+                else:
+                    raise Exception("Duplicated configuration \
+name '%s' found !" % Match.group(1))
+
+        # Match config item in DSC
+        PlatformId = None
+        OutLines = []
+        TagName = ''
+        Level = 0
+        for Item in self._CfgItemList:
+            Name = None
+            if Level == 0 and Item['embed'].endswith(':START'):
+                TagName = Item['embed'].split(':')[0]
+                Level += 1
+            if Item['cname'] in AbsfBuiltValDict:
+                ValStr = AbsfBuiltValDict[Item['cname']]
+                Name = '%s.%s' % (TagName, Item['cname'])
+                if not Item['subreg'] and Item['value'].startswith('{'):
+                    Value = Array2Val(Item['value'])
+                    IsArray = True
+                else:
+                    Value = int(Item['value'], 16)
+                    IsArray = False
+                AbsfVal = Array2Val(ValStr)
+                if AbsfVal != Value:
+                    if 'PLATFORMID_CFG_DATA.PlatformId' == Name:
+                        PlatformId = AbsfVal
+                    self.WriteDeltaLine(OutLines, Name, ValStr, IsArray)
+                else:
+                    if 'PLATFORMID_CFG_DATA.PlatformId' == Name:
+                        raise Exception("'PlatformId' has the \
+same value as DSC default !")
+
+            if Item['subreg']:
+                for SubItem in Item['subreg']:
+                    if SubItem['cname'] in AbsfBuiltValDict:
+                        ValStr = AbsfBuiltValDict[SubItem['cname']]
+                        if Array2Val(ValStr) == int(SubItem['value'], 16):
+                            continue
+                        Name = '%s.%s.%s' % (TagName, Item['cname'],
+                                             SubItem['cname'])
+                        self.WriteDeltaLine(OutLines, Name, ValStr, False)
+
+            if Item['embed'].endswith(':END'):
+                Level -= 1
+
+        if PlatformId is None:
+            if Lines:
+                raise Exception("'PlatformId' configuration \
+is missing in ABSF file!")
+            PlatformId = 0
+
+        # Write out only the collected delta lines
+        self.WriteDeltaFile(OutFile, PlatformId, OutLines)
+
+        return 0
+
+    def GenerateDscFile(self, OutFile):
+        DscFd = open(OutFile, "w")
+        for Line in self._DscLines:
+            DscFd.write(Line + '\n')
+        DscFd.close()
+        return 0
+
+
+def Usage():
+    print('\n'.join([
+        "GenCfgData Version 0.01",
+        "Usage:",
+        "    GenCfgData GENINC BinFile \
+IncOutFile [-D Macros]",
+        "    GenCfgData GENPKL DscFile \
+PklOutFile [-D Macros]",
+        "    GenCfgData GENINC DscFile[;DltFile] \
+IncOutFile [-D Macros]",
+        "    GenCfgData GENBIN DscFile[;DltFile] \
+BinOutFile [-D Macros]",
+        "    GenCfgData GENBSF DscFile[;DltFile] \
+BsfOutFile [-D Macros]",
+        "    GenCfgData GENDLT DscFile[;AbsfFile] \
+DltOutFile [-D Macros]",
+        "    GenCfgData GENDSC DscFile \
+DscOutFile [-D Macros]",
+        "    GenCfgData GENHDR DscFile[;DltFile] \
+HdrOutFile[;ComHdrOutFile] [-D Macros]"
+    ]))
+
+
+def Main():
+    #
+    # Parse the options and args
+    #
+    argc = len(sys.argv)
+    if argc < 4:
+        Usage()
+        return 1
+
+    GenCfgData = CGenCfgData()
+    Command = sys.argv[1].upper()
+    OutFile = sys.argv[3]
+
+    if argc > 5 and GenCfgData.ParseMacros(sys.argv[4:]) != 0:
+        raise Exception("ERROR: Macro parsing failed !")
+
+    FileList = sys.argv[2].split(';')
+    if len(FileList) == 2:
+        DscFile = FileList[0]
+        DltFile = FileList[1]
+    elif len(FileList) == 1:
+        DscFile = FileList[0]
+        DltFile = ''
+    else:
+        raise Exception("ERROR: Invalid parameter '%s' !" % sys.argv[2])
+
+    if Command == "GENDLT" and DscFile.endswith('.dlt'):
+        # It needs to expand an existing DLT file
+        DltFile = DscFile
+        Lines = CGenCfgData.ExpandIncludeFiles(DltFile)
+        OutTxt = ''.join([x[0] for x in Lines])
+        OutFile = open(OutFile, "w")
+        OutFile.write(OutTxt)
+        OutFile.close()
+        return 0
+
+    if not os.path.exists(DscFile):
+        raise Exception("ERROR: Cannot open file '%s' !" % DscFile)
+
+    CfgBinFile = ''
+    if DltFile:
+        if not os.path.exists(DltFile):
+            raise Exception("ERROR: Cannot open file '%s' !" % DltFile)
+        if Command == "GENDLT":
+            CfgBinFile = DltFile
+            DltFile = ''
+
+    BinFile = ''
+    if (DscFile.lower().endswith('.bin')) and (Command == "GENINC"):
+        # It is a binary file
+        BinFile = DscFile
+        DscFile = ''
+
+    if BinFile:
+        if GenCfgData.GenerateDataIncFile(OutFile, BinFile) != 0:
+            raise Exception(GenCfgData.Error)
+        return 0
+
+    if DscFile.lower().endswith('.pkl'):
+        with open(DscFile, "rb") as PklFile:
+            GenCfgData.__dict__ = marshal.load(PklFile)
+    else:
+        if GenCfgData.ParseDscFile(DscFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    # if GenCfgData.CheckCfgData() != 0:
+    #     raise Exception(GenCfgData.Error)
+
+    if GenCfgData.CreateVarDict() != 0:
+        raise Exception(GenCfgData.Error)
+
+    if Command == 'GENPKL':
+        with open(OutFile, "wb") as PklFile:
+            marshal.dump(GenCfgData.__dict__, PklFile)
+        return 0
+
+    if DltFile and Command in ['GENHDR', 'GENBIN', 'GENINC', 'GENBSF']:
+        if GenCfgData.OverrideDefaultValue(DltFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    if GenCfgData.UpdateDefaultValue() != 0:
+        raise Exception(GenCfgData.Error)
+
+    # GenCfgData.PrintData ()
+
+    if sys.argv[1] == "GENBIN":
+        if GenCfgData.GenerateBinary(OutFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    elif sys.argv[1] == "GENHDR":
+        OutFiles = OutFile.split(';')
+        BrdOutFile = OutFiles[0].strip()
+        if len(OutFiles) > 1:
+            ComOutFile = OutFiles[1].strip()
+        else:
+            ComOutFile = ''
+        if GenCfgData.CreateHeaderFile(BrdOutFile, ComOutFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    elif sys.argv[1] == "GENBSF":
+        if GenCfgData.GenerateBsfFile(OutFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    elif sys.argv[1] == "GENINC":
+        if GenCfgData.GenerateDataIncFile(OutFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    elif sys.argv[1] == "GENDLT":
+        if GenCfgData.GenerateDeltaFile(OutFile, CfgBinFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    elif sys.argv[1] == "GENDSC":
+        if GenCfgData.GenerateDscFile(OutFile) != 0:
+            raise Exception(GenCfgData.Error)
+
+    else:
+        raise Exception("Unsupported command '%s' !" % Command)
+
+    return 0
+
+
+if __name__ == '__main__':
+    sys.exit(Main())
--
2.28.0.windows.1
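The GENDLT path in GenerateDeltaFile above matches `$_AS_BUILT_` lines from an ABSF report with a regular expression and turns each changed value into a `Name | Value` delta line via WriteDeltaLine. A minimal standalone sketch of that round trip follows; the sample line, helper names, and the `int(..., 0)`-based scalar conversion (the tool itself uses its `Array2Val` helper) are illustrative assumptions, not code from the patch:

```python
import re

# Same shape as the regex in GenerateDeltaFile: captures the config name,
# its size, the unit, and the as-built value from one ABSF StructDef line.
ABSF_PATTERN = re.compile(
    r'\s*\$gCfgData_(\w+)\s+(\d+)\s+(bits|bytes)\s+\$_AS_BUILT_\s+=\s+(.+)\$')


def parse_absf_line(line):
    """Return (name, value-string) for an as-built ABSF line, else None."""
    match = ABSF_PATTERN.match(line)
    if match:
        return match.group(1), match.group(4).strip()
    return None


def format_delta_line(name, val_str, is_array):
    """Mirror WriteDeltaLine: arrays keep their braces, scalars become hex."""
    if is_array:
        return '%s | { %s }' % (name, val_str)
    return '%s | 0x%X' % (name, int(val_str, 0))


# A hypothetical ABSF line (not taken from a real report):
sample = '  $gCfgData_PlatformId 2 bytes $_AS_BUILT_ = 0x000A$'
name, val = parse_absf_line(sample)
print(format_delta_line(name, val, False))  # -> PlatformId | 0xA
```

This also shows why scalar deltas are re-emitted in canonical hex rather than copied verbatim: the as-built string may be zero-padded while the delta format is normalized.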


Re: [PATCH v7] IntelFsp2Pkg: Add Config Editor tool support

Chiu, Chasel
 

Reviewed-by: Chasel Chiu <chasel.chiu@intel.com>

-----Original Message-----
From: Loo, Tung Lun <tung.lun.loo@intel.com>
Sent: Thursday, June 24, 2021 4:38 PM
To: devel@edk2.groups.io
Cc: Loo, Tung Lun <tung.lun.loo@intel.com>; Ma, Maurice
<maurice.ma@intel.com>; Desimone, Nathaniel L
<nathaniel.l.desimone@intel.com>; Zeng, Star <star.zeng@intel.com>; Chiu,
Chasel <chasel.chiu@intel.com>
Subject: [PATCH v7] IntelFsp2Pkg: Add Config Editor tool support

This is a GUI interface that lets users change configuration
settings directly, without having to modify the source.

This tool depends on Python GUI tool kit Tkinter.
It runs on both Windows and Linux.

The user loads the YAML file, along with the DLT file for a
specific board, into ConfigEditor and changes the desired
configuration values. Finally, a new configuration delta file or
a config binary blob is generated so the changed values take
effect. These become the inputs to the merge tool or the stitch
tool, so that the new config changes can be merged and stitched
into the final configuration blob.

This tool also supports updating binaries directly and displaying
FSP information, and it remains backward compatible with the BSF
file format.

Running Configuration Editor:
python ConfigEditor.py
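For reference, a generated configuration delta file is a plain-text list of `Name | Value` overrides with a platform-ID header; a hypothetical example (names and values are illustrative, not from a real board):

```
#
# Delta configuration values for platform ID 0x000A
#

PLATFORMID_CFG_DATA.PlatformId | 0x000A
GPIO_CFG_DATA.GpioPinConfig    | { 0x00, 0x01 }
```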

Co-authored-by: Maurice Ma <maurice.ma@intel.com>
Cc: Maurice Ma <maurice.ma@intel.com>
Cc: Nate DeSimone <nathaniel.l.desimone@intel.com>
Cc: Star Zeng <star.zeng@intel.com>
Cc: Chasel Chiu <chasel.chiu@intel.com>
Signed-off-by: Loo Tung Lun <tung.lun.loo@intel.com>
---
IntelFsp2Pkg/Tools/ConfigEditor/CommonUtility.py |  504 ++++++++++++++++++
IntelFsp2Pkg/Tools/ConfigEditor/ConfigEditor.py  | 1499 ++++++++++++++++++