Boxing refers to developing physical and informational containment methods within which a potentially smarter-than-human AI operates. Done properly, boxing would make it impossible for the AI to escape. Difficulties arise from bugs in the design of the box and from escape through social engineering.
A fully boxed AI serves no purpose, and the more an AI can interact with the world, the greater its possible benefits. There will therefore be a reluctance to box.
Boxing is neither appropriate nor necessary for AI today. The difficulty, as AI advances, is deciding when boxing becomes necessary: boxing an AI abruptly would mean forfeiting the large advantages it is delivering. And if the technology has diffused, as AI today has, everyone with access to it would have to be convinced to box. That appears extremely difficult to achieve.
Boxing is probably only feasible, and only makes sense, if AGI is developed by a single group that makes a major breakthrough.