Author Topic: How do you make an NASM macro use different code for 16/32 bits?  (Read 13152 times)

Offline ben321

  • Full Member
  • **
  • Posts: 185
Let's say a macro included something like this:
MOV EAX,[ptrData]

That's for 32-bit code. Let's say I wanted the macro to be usable in both 32-bit and 16-bit code, though, so I didn't need to write two separate macros. Is there a predefined constant that NASM sets which would let your macro behave differently? For the above example, I'd want the 16-bit code to say:
MOV AX,[ptrData]

Offline debs3759

  • Global Moderator
  • Full Member
  • *****
  • Posts: 224
  • Country: gb
    • GPUZoo
Re: How do you make an NASM macro use different code for 16/32 bits?
« Reply #1 on: March 12, 2023, 03:42:01 PM »
That's where I'd use things like #ifdefine and its companions. NASM doesn't write your code for you.
My graphics card database: www.gpuzoo.com

Offline ben321

  • Full Member
  • **
  • Posts: 185
Re: How do you make an NASM macro use different code for 16/32 bits?
« Reply #2 on: March 12, 2023, 11:13:33 PM »
That's where I'd use things like #ifdefine and its companions. NASM doesn't write your code for you.

Isn't ifdefine a C preprocessor directive, though? I didn't know NASM could do that. I had hoped it would. So I'd use it like this, then:

#ifdefine Is16Bits
;16bit code goes here
#else
;32bit code goes here
#endif


How do I do the "Is16Bits" part, though? Does NASM provide an assembler-set constant that is either defined or not defined, depending on which bitness mode is currently in use?

Offline debs3759

  • Global Moderator
  • Full Member
  • *****
  • Posts: 224
  • Country: gb
    • GPUZoo
Re: How do you make an NASM macro use different code for 16/32 bits?
« Reply #3 on: March 13, 2023, 01:37:32 AM »
I just looked it up in the manual (section 4.4.1, you will find it helpful to read the manual), and I should have said #ifdef. The code would be

#ifdef Is16Bits
;16bit code goes here
#else
;32bit code goes here
#endif

My graphics card database: www.gpuzoo.com
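A note on syntax: NASM's preprocessor directives are spelled with a leading % rather than # (the # forms above come from C). A working sketch of the same idea, with Is16Bits as a user-chosen symbol you define yourself (in the source with %define, or on the command line with nasm -dIs16Bits), might look like:

    %ifdef Is16Bits
            mov ax, [ptrData]     ; 16-bit form
    %else
            mov eax, [ptrData]    ; 32-bit form
    %endif

This still requires you to define Is16Bits by hand, which is what the next reply improves on.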

Offline fredericopissarra

  • Full Member
  • **
  • Posts: 373
  • Country: br
Re: How do you make an NASM macro use different code for 16/32 bits?
« Reply #4 on: March 13, 2023, 02:06:14 PM »
NASM docs, section 5.3:
%if __?BITS?__ == 16
...
%elif __?BITS?__ == 32
...
%elif __?BITS?__ == 64
...
%endif
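To tie it back to the original question: the standard macro __?BITS?__ (spelled __BITS__ in NASM versions before 2.15) expands to 16, 32, or 64 according to the current BITS setting, so the whole thing can live inside one macro. A minimal sketch, where the macro name LOADPTR and the label ptrData are illustrative only:

    %macro LOADPTR 0
      %if __?BITS?__ == 16
            mov ax, [ptrData]       ; 16-bit: use AX
      %elif __?BITS?__ == 32
            mov eax, [ptrData]      ; 32-bit: use EAX
      %else
            mov rax, [ptrData]      ; 64-bit: use RAX
      %endif
    %endmacro

Invoking LOADPTR under bits 16 then assembles the AX form, and under bits 32 the EAX form, with no extra defines needed.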