Why are you making a file this large? I'm not sure I want to know. I would strongly consider breaking it up into modular pieces, such as shared libraries or other abstractions. My guess is that you are hitting the OS's memory limits and it is compensating by using swap space. The other factor is the tooling itself: large symbol tables or search trees grow with the input, which compounds the memory pressure, so both sides slow down over time as more data has to be swapped or searched. The number of other tasks the machine is running has the same effect; no matter how powerful the processor is, it will eventually bog down too.
I agree the OP should look into why their file is so big. Making it smaller should help.
But with the sorts of machines people now have available, big files should not be a problem. I've just finished a compiler project where the multiple source files of an application are compiled into a single large .asm file. The largest real .asm file I have at present is 330K lines (15% of the OP's file) and 7MB (10% of the OP's), but it assembles in 0.2 seconds and uses 0.03 GB of memory (less than 1% of my machine's RAM).
However, it's not practical to use Nasm for this, as it seems to scale badly with large inputs. If the OP's 2.2M lines are anything like my code, the file would need splitting into roughly 100 modules of about 20K lines each to bring assembly times down, and even then it wouldn't be fast.
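For what it's worth, the splitting itself is mechanical once cross-module symbols are declared: each piece exports its entry points with `global` and imports what it needs with `extern`, and the objects are linked back together. A minimal sketch, assuming x86-64 Linux with `-f elf64` output and `ld` for linking; the file names `main.asm` and `helper.asm` are just placeholders, not the OP's actual layout:

```
; main.asm -- placeholder module holding the program entry point
global _start
extern helper              ; defined in helper.asm

section .text
_start:
    call helper            ; cross-module call, resolved at link time
    mov  rax, 60           ; sys_exit
    xor  rdi, rdi          ; exit status 0
    syscall
```

```
; helper.asm -- placeholder module exporting one symbol
global helper

section .text
helper:
    ret
```

```
nasm -f elf64 main.asm   -o main.o
nasm -f elf64 helper.asm -o helper.o
ld -o prog main.o helper.o
```

One side benefit: each module assembles independently, so a change to one piece only re-runs Nasm on that piece, and the smaller runs can be done in parallel (e.g. with make -j) rather than one 2.2M-line monolithic pass.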
So if the file can't practically be reduced, I think it's necessary to look at alternative assemblers. (The OP didn't say how long their successful Nasm run took.)